00:00:00.001 Started by upstream project "autotest-spdk-master-vs-dpdk-v23.11" build number 980
00:00:00.001 originally caused by:
00:00:00.001 Started by upstream project "nightly-trigger" build number 3642
00:00:00.001 originally caused by:
00:00:00.001 Started by timer
00:00:00.024 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.024 The recommended git tool is: git
00:00:00.025 using credential 00000000-0000-0000-0000-000000000002
00:00:00.027 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.046 Fetching changes from the remote Git repository
00:00:00.049 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.073 Using shallow fetch with depth 1
00:00:00.073 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.073 > git --version # timeout=10
00:00:00.112 > git --version # 'git version 2.39.2'
00:00:00.112 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.184 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.184 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:03.084 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:03.093 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:03.103 Checking out Revision db4637e8b949f278f369ec13f70585206ccd9507 (FETCH_HEAD)
00:00:03.103 > git config core.sparsecheckout # timeout=10
00:00:03.113 > git read-tree -mu HEAD # timeout=10
00:00:03.127 > git checkout -f db4637e8b949f278f369ec13f70585206ccd9507 # timeout=5
00:00:03.142 Commit message: "jenkins/jjb-config: Add missing SPDK_TEST_NVME_INTERRUPT flag"
00:00:03.142 > git rev-list --no-walk db4637e8b949f278f369ec13f70585206ccd9507 # timeout=10
00:00:03.221 [Pipeline] Start of Pipeline
00:00:03.236 [Pipeline] library
00:00:03.238 Loading library shm_lib@master
00:00:03.238 Library shm_lib@master is cached. Copying from home.
00:00:03.275 [Pipeline] node
00:00:03.297 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:03.299 [Pipeline] {
00:00:03.306 [Pipeline] catchError
00:00:03.307 [Pipeline] {
00:00:03.315 [Pipeline] wrap
00:00:03.320 [Pipeline] {
00:00:03.325 [Pipeline] stage
00:00:03.326 [Pipeline] { (Prologue)
00:00:03.516 [Pipeline] sh
00:00:03.802 + logger -p user.info -t JENKINS-CI
00:00:03.820 [Pipeline] echo
00:00:03.822 Node: WFP20
00:00:03.829 [Pipeline] sh
00:00:04.132 [Pipeline] setCustomBuildProperty
00:00:04.144 [Pipeline] echo
00:00:04.145 Cleanup processes
00:00:04.151 [Pipeline] sh
00:00:04.436 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:04.436 178938 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:04.448 [Pipeline] sh
00:00:04.733 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:04.733 ++ grep -v 'sudo pgrep'
00:00:04.733 ++ awk '{print $1}'
00:00:04.733 + sudo kill -9
00:00:04.733 + true
00:00:04.747 [Pipeline] cleanWs
00:00:04.756 [WS-CLEANUP] Deleting project workspace...
00:00:04.756 [WS-CLEANUP] Deferred wipeout is used...
00:00:04.763 [WS-CLEANUP] done
00:00:04.768 [Pipeline] setCustomBuildProperty
00:00:04.780 [Pipeline] sh
00:00:05.068 + sudo git config --global --replace-all safe.directory '*'
00:00:05.188 [Pipeline] httpRequest
00:00:05.891 [Pipeline] echo
00:00:05.893 Sorcerer 10.211.164.20 is alive
00:00:05.901 [Pipeline] retry
00:00:05.903 [Pipeline] {
00:00:05.915 [Pipeline] httpRequest
00:00:05.920 HttpMethod: GET
00:00:05.920 URL: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:05.921 Sending request to url: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:05.935 Response Code: HTTP/1.1 200 OK
00:00:05.935 Success: Status code 200 is in the accepted range: 200,404
00:00:05.935 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:08.902 [Pipeline] }
00:00:08.915 [Pipeline] // retry
00:00:08.921 [Pipeline] sh
00:00:09.217 + tar --no-same-owner -xf jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:09.234 [Pipeline] httpRequest
00:00:09.606 [Pipeline] echo
00:00:09.608 Sorcerer 10.211.164.20 is alive
00:00:09.618 [Pipeline] retry
00:00:09.620 [Pipeline] {
00:00:09.634 [Pipeline] httpRequest
00:00:09.638 HttpMethod: GET
00:00:09.639 URL: http://10.211.164.20/packages/spdk_d47eb51c960b88a8c704cc184fd594dbc3abad70.tar.gz
00:00:09.640 Sending request to url: http://10.211.164.20/packages/spdk_d47eb51c960b88a8c704cc184fd594dbc3abad70.tar.gz
00:00:09.659 Response Code: HTTP/1.1 200 OK
00:00:09.659 Success: Status code 200 is in the accepted range: 200,404
00:00:09.660 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_d47eb51c960b88a8c704cc184fd594dbc3abad70.tar.gz
00:01:44.550 [Pipeline] }
00:01:44.567 [Pipeline] // retry
00:01:44.574 [Pipeline] sh
00:01:44.868 + tar --no-same-owner -xf spdk_d47eb51c960b88a8c704cc184fd594dbc3abad70.tar.gz
00:01:47.426 [Pipeline] sh
00:01:47.718 + git -C spdk log --oneline -n5
00:01:47.718 d47eb51c9 bdev: fix a race between reset start and complete
00:01:47.718 83e8405e4 nvmf/fc: Qpair disconnect callback: Serialize FC delete connection & close qpair process
00:01:47.718 0eab4c6fb nvmf/fc: Validate the ctrlr pointer inside nvmf_fc_req_bdev_abort()
00:01:47.718 4bcab9fb9 correct kick for CQ full case
00:01:47.718 8531656d3 test/nvmf: Interrupt test for local pcie nvme device
00:01:47.737 [Pipeline] withCredentials
00:01:47.749 > git --version # timeout=10
00:01:47.763 > git --version # 'git version 2.39.2'
00:01:47.782 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS
00:01:47.784 [Pipeline] {
00:01:47.794 [Pipeline] retry
00:01:47.796 [Pipeline] {
00:01:47.812 [Pipeline] sh
00:01:48.101 + git ls-remote http://dpdk.org/git/dpdk-stable v23.11
00:01:48.114 [Pipeline] }
00:01:48.135 [Pipeline] // retry
00:01:48.140 [Pipeline] }
00:01:48.161 [Pipeline] // withCredentials
00:01:48.172 [Pipeline] httpRequest
00:01:48.642 [Pipeline] echo
00:01:48.644 Sorcerer 10.211.164.20 is alive
00:01:48.655 [Pipeline] retry
00:01:48.657 [Pipeline] {
00:01:48.671 [Pipeline] httpRequest
00:01:48.676 HttpMethod: GET
00:01:48.677 URL: http://10.211.164.20/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz
00:01:48.678 Sending request to url: http://10.211.164.20/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz
00:01:48.691 Response Code: HTTP/1.1 200 OK
00:01:48.691 Success: Status code 200 is in the accepted range: 200,404
00:01:48.692 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz
00:01:59.249 [Pipeline] }
00:01:59.269 [Pipeline] // retry
00:01:59.277 [Pipeline] sh
00:01:59.569 + tar --no-same-owner -xf dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz
00:02:00.965 [Pipeline] sh
00:02:01.253 + git -C dpdk log --oneline -n5
00:02:01.253 eeb0605f11 version: 23.11.0
00:02:01.253 238778122a doc: update release notes for 23.11
00:02:01.253 46aa6b3cfc doc: fix description of RSS features
00:02:01.253 dd88f51a57 devtools: forbid DPDK API in cnxk base driver
00:02:01.253 7e421ae345 devtools: support skipping forbid rule check
00:02:01.264 [Pipeline] }
00:02:01.278 [Pipeline] // stage
00:02:01.287 [Pipeline] stage
00:02:01.289 [Pipeline] { (Prepare)
00:02:01.309 [Pipeline] writeFile
00:02:01.324 [Pipeline] sh
00:02:01.612 + logger -p user.info -t JENKINS-CI
00:02:01.626 [Pipeline] sh
00:02:01.914 + logger -p user.info -t JENKINS-CI
00:02:01.927 [Pipeline] sh
00:02:02.215 + cat autorun-spdk.conf
00:02:02.215 SPDK_RUN_FUNCTIONAL_TEST=1
00:02:02.215 SPDK_TEST_FUZZER_SHORT=1
00:02:02.215 SPDK_TEST_FUZZER=1
00:02:02.215 SPDK_TEST_SETUP=1
00:02:02.215 SPDK_RUN_UBSAN=1
00:02:02.215 SPDK_TEST_NATIVE_DPDK=v23.11
00:02:02.215 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:02:02.224 RUN_NIGHTLY=1
00:02:02.229 [Pipeline] readFile
00:02:02.254 [Pipeline] withEnv
00:02:02.256 [Pipeline] {
00:02:02.268 [Pipeline] sh
00:02:02.558 + set -ex
00:02:02.558 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]]
00:02:02.558 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:02:02.558 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:02:02.558 ++ SPDK_TEST_FUZZER_SHORT=1
00:02:02.558 ++ SPDK_TEST_FUZZER=1
00:02:02.558 ++ SPDK_TEST_SETUP=1
00:02:02.558 ++ SPDK_RUN_UBSAN=1
00:02:02.558 ++ SPDK_TEST_NATIVE_DPDK=v23.11
00:02:02.558 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:02:02.558 ++ RUN_NIGHTLY=1
00:02:02.558 + case $SPDK_TEST_NVMF_NICS in
00:02:02.558 + DRIVERS=
00:02:02.558 + [[ -n '' ]]
00:02:02.558 + exit 0
00:02:02.568 [Pipeline] }
00:02:02.582 [Pipeline] // withEnv
00:02:02.588 [Pipeline] }
00:02:02.602 [Pipeline] // stage
00:02:02.611 [Pipeline] catchError
00:02:02.613 [Pipeline] {
00:02:02.627 [Pipeline] timeout
00:02:02.627 Timeout set to expire in 30 min
00:02:02.629 [Pipeline] {
00:02:02.643 [Pipeline] stage
00:02:02.645 [Pipeline] { (Tests)
00:02:02.659 [Pipeline] sh
00:02:02.950 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest
00:02:02.950 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest
00:02:02.950 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest
00:02:02.950 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]]
00:02:02.950 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:02:02.950 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output
00:02:02.950 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]]
00:02:02.950 + [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:02:02.950 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output
00:02:02.950 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:02:02.950 + [[ short-fuzz-phy-autotest == pkgdep-* ]]
00:02:02.950 + cd /var/jenkins/workspace/short-fuzz-phy-autotest
00:02:02.950 + source /etc/os-release
00:02:02.950 ++ NAME='Fedora Linux'
00:02:02.950 ++ VERSION='39 (Cloud Edition)'
00:02:02.950 ++ ID=fedora
00:02:02.950 ++ VERSION_ID=39
00:02:02.950 ++ VERSION_CODENAME=
00:02:02.950 ++ PLATFORM_ID=platform:f39
00:02:02.950 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)'
00:02:02.950 ++ ANSI_COLOR='0;38;2;60;110;180'
00:02:02.950 ++ LOGO=fedora-logo-icon
00:02:02.950 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39
00:02:02.950 ++ HOME_URL=https://fedoraproject.org/
00:02:02.950 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/
00:02:02.950 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:02:02.950 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:02:02.950 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:02:02.950 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39
00:02:02.950 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:02:02.950 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39
00:02:02.950 ++ SUPPORT_END=2024-11-12
00:02:02.950 ++ VARIANT='Cloud Edition'
00:02:02.950 ++ VARIANT_ID=cloud
00:02:02.950 + uname -a
00:02:02.950 Linux spdk-wfp-20 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux
00:02:02.950 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:02:06.255 Hugepages
00:02:06.255 node hugesize free / total
00:02:06.255 node0 1048576kB 0 / 0
00:02:06.255 node0 2048kB 0 / 0
00:02:06.255 node1 1048576kB 0 / 0
00:02:06.255 node1 2048kB 0 / 0
00:02:06.255
00:02:06.255 Type BDF Vendor Device NUMA Driver Device Block devices
00:02:06.255 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:02:06.255 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:02:06.255 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:02:06.255 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:02:06.255 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:02:06.255 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:02:06.255 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:02:06.255 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:02:06.255 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:02:06.255 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:02:06.255 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:02:06.255 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:02:06.255 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:02:06.255 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:02:06.255 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:02:06.255 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:02:06.255 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1
00:02:06.255 + rm -f /tmp/spdk-ld-path
00:02:06.255 + source autorun-spdk.conf
00:02:06.255 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:02:06.255 ++ SPDK_TEST_FUZZER_SHORT=1
00:02:06.255 ++ SPDK_TEST_FUZZER=1
00:02:06.255 ++ SPDK_TEST_SETUP=1
00:02:06.255 ++ SPDK_RUN_UBSAN=1
00:02:06.255 ++ SPDK_TEST_NATIVE_DPDK=v23.11
00:02:06.255 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:02:06.255 ++ RUN_NIGHTLY=1
00:02:06.255 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:02:06.255 + [[ -n '' ]]
00:02:06.255 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:02:06.255 + for M in /var/spdk/build-*-manifest.txt
00:02:06.255 + [[ -f /var/spdk/build-kernel-manifest.txt ]]
00:02:06.255 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:02:06.255 + for M in /var/spdk/build-*-manifest.txt
00:02:06.255 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:02:06.255 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:02:06.255 + for M in /var/spdk/build-*-manifest.txt
00:02:06.255 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:02:06.255 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:02:06.255 ++ uname
00:02:06.255 + [[ Linux == \L\i\n\u\x ]]
00:02:06.255 + sudo dmesg -T
00:02:06.255 + sudo dmesg --clear
00:02:06.255 + dmesg_pid=179992
00:02:06.255 + [[ Fedora Linux == FreeBSD ]]
00:02:06.255 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:02:06.255 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:02:06.255 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:02:06.255 + [[ -x /usr/src/fio-static/fio ]]
00:02:06.255 + export FIO_BIN=/usr/src/fio-static/fio
00:02:06.255 + FIO_BIN=/usr/src/fio-static/fio
00:02:06.255 + sudo dmesg -Tw
00:02:06.255 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:02:06.255 + [[ ! -v VFIO_QEMU_BIN ]]
00:02:06.255 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:02:06.255 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:02:06.255 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:02:06.255 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:02:06.255 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:02:06.255 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:02:06.255 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:02:06.255 14:15:02 -- common/autotest_common.sh@1692 -- $ [[ n == y ]]
00:02:06.255 14:15:02 -- spdk/autorun.sh@20 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:02:06.255 14:15:02 -- short-fuzz-phy-autotest/autorun-spdk.conf@1 -- $ SPDK_RUN_FUNCTIONAL_TEST=1
00:02:06.255 14:15:02 -- short-fuzz-phy-autotest/autorun-spdk.conf@2 -- $ SPDK_TEST_FUZZER_SHORT=1
00:02:06.255 14:15:02 -- short-fuzz-phy-autotest/autorun-spdk.conf@3 -- $ SPDK_TEST_FUZZER=1
00:02:06.255 14:15:02 -- short-fuzz-phy-autotest/autorun-spdk.conf@4 -- $ SPDK_TEST_SETUP=1
00:02:06.255 14:15:02 -- short-fuzz-phy-autotest/autorun-spdk.conf@5 -- $ SPDK_RUN_UBSAN=1
00:02:06.255 14:15:02 -- short-fuzz-phy-autotest/autorun-spdk.conf@6 -- $ SPDK_TEST_NATIVE_DPDK=v23.11
00:02:06.255 14:15:02 -- short-fuzz-phy-autotest/autorun-spdk.conf@7 -- $ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:02:06.255 14:15:02 -- short-fuzz-phy-autotest/autorun-spdk.conf@8 -- $ RUN_NIGHTLY=1
00:02:06.255 14:15:02 -- spdk/autorun.sh@22 -- $ trap 'timing_finish || exit 1' EXIT
00:02:06.255 14:15:02 -- spdk/autorun.sh@25 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autobuild.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:02:06.517 14:15:02 -- common/autotest_common.sh@1692 -- $ [[ n == y ]]
00:02:06.517 14:15:02 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:02:06.517 14:15:02 -- scripts/common.sh@15 -- $ shopt -s extglob
00:02:06.517 14:15:02 -- scripts/common.sh@544 -- $ [[ -e /bin/wpdk_common.sh ]]
00:02:06.517 14:15:02 -- scripts/common.sh@552 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:02:06.517 14:15:02 -- scripts/common.sh@553 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:02:06.517 14:15:02 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:02:06.517 14:15:02 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:02:06.517 14:15:02 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:02:06.517 14:15:02 -- paths/export.sh@5 -- $ export PATH
00:02:06.517 14:15:02 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:02:06.517 14:15:02 -- common/autobuild_common.sh@485 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
00:02:06.517 14:15:02 -- common/autobuild_common.sh@486 -- $ date +%s
00:02:06.517 14:15:02 -- common/autobuild_common.sh@486 -- $ mktemp -dt spdk_1731935702.XXXXXX
00:02:06.517 14:15:02 -- common/autobuild_common.sh@486 -- $ SPDK_WORKSPACE=/tmp/spdk_1731935702.adrxQW
00:02:06.517 14:15:02 -- common/autobuild_common.sh@488 -- $ [[ -n '' ]]
00:02:06.517 14:15:02 -- common/autobuild_common.sh@492 -- $ '[' -n v23.11 ']'
00:02:06.517 14:15:02 -- common/autobuild_common.sh@493 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:02:06.517 14:15:02 -- common/autobuild_common.sh@493 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk'
00:02:06.517 14:15:02 -- common/autobuild_common.sh@499 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
00:02:06.517 14:15:02 -- common/autobuild_common.sh@501 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:02:06.517 14:15:02 -- common/autobuild_common.sh@502 -- $ get_config_params
00:02:06.517 14:15:02 -- common/autotest_common.sh@409 -- $ xtrace_disable
00:02:06.517 14:15:02 -- common/autotest_common.sh@10 -- $ set +x
00:02:06.517 14:15:02 -- common/autobuild_common.sh@502 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user'
00:02:06.517 14:15:02 -- common/autobuild_common.sh@504 -- $ start_monitor_resources
00:02:06.517 14:15:02 -- pm/common@17 -- $ local monitor
00:02:06.517 14:15:02 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:02:06.517 14:15:02 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:02:06.517 14:15:02 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:02:06.517 14:15:02 -- pm/common@21 -- $ date +%s
00:02:06.517 14:15:02 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:02:06.517 14:15:02 -- pm/common@21 -- $ date +%s
00:02:06.517 14:15:02 -- pm/common@25 -- $ sleep 1
00:02:06.517 14:15:02 -- pm/common@21 -- $ date +%s
00:02:06.517 14:15:02 -- pm/common@21 -- $ date +%s
00:02:06.517 14:15:02 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1731935702
00:02:06.517 14:15:02 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1731935702
00:02:06.517 14:15:02 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1731935702
00:02:06.517 14:15:02 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1731935702
00:02:06.517 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1731935702_collect-cpu-load.pm.log
00:02:06.517 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1731935702_collect-vmstat.pm.log
00:02:06.517 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1731935702_collect-cpu-temp.pm.log
00:02:06.517 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1731935702_collect-bmc-pm.bmc.pm.log
00:02:07.459 14:15:03 -- common/autobuild_common.sh@505 -- $ trap stop_monitor_resources EXIT
00:02:07.459 14:15:03 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
00:02:07.459 14:15:03 -- spdk/autobuild.sh@12 -- $ umask 022
00:02:07.459 14:15:03 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:02:07.459 14:15:03 -- spdk/autobuild.sh@16 -- $ date -u
00:02:07.459 Mon Nov 18 01:15:03 PM UTC 2024
00:02:07.459 14:15:03 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:02:07.459 v25.01-pre-190-gd47eb51c9
00:02:07.459 14:15:03 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
00:02:07.459 14:15:03 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
00:02:07.459 14:15:03 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
00:02:07.459 14:15:03 -- common/autotest_common.sh@1105 -- $ '[' 3 -le 1 ']'
00:02:07.459 14:15:03 -- common/autotest_common.sh@1111 -- $ xtrace_disable
00:02:07.459 14:15:03 -- common/autotest_common.sh@10 -- $ set +x
00:02:07.459 ************************************
00:02:07.459 START TEST ubsan
00:02:07.459 ************************************
00:02:07.459 14:15:03 ubsan -- common/autotest_common.sh@1129 -- $ echo 'using ubsan'
using ubsan
00:02:07.459
00:02:07.459 real 0m0.001s
00:02:07.459 user 0m0.000s
00:02:07.459 sys 0m0.000s
00:02:07.459 14:15:03 ubsan -- common/autotest_common.sh@1130 -- $ xtrace_disable
00:02:07.459 14:15:03 ubsan -- common/autotest_common.sh@10 -- $ set +x
00:02:07.459 ************************************
00:02:07.459 END TEST ubsan
00:02:07.459 ************************************
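The START/END banners and the real/user/sys triple above come from the run_test helper traced in common/autotest_common.sh, which frames each test in banners and timing output. As a rough illustration of that pattern only, not SPDK's actual implementation, a wrapper like the following reproduces what the log shows (run_test_sketch is a hypothetical name):

    run_test_sketch() {
        # Print a banner, run the wrapped command under bash's `time`
        # builtin (which emits the real/user/sys lines), close the banner.
        local name=$1; shift
        echo "************************************"
        echo "START TEST $name"
        echo "************************************"
        time "$@"
        echo "************************************"
        echo "END TEST $name"
        echo "************************************"
    }

    run_test_sketch ubsan echo 'using ubsan'   # mirrors the ubsan test logged above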
00:02:07.723 14:15:03 -- spdk/autobuild.sh@27 -- $ '[' -n v23.11 ']'
00:02:07.723 14:15:03 -- spdk/autobuild.sh@28 -- $ build_native_dpdk
00:02:07.723 14:15:03 -- common/autobuild_common.sh@442 -- $ run_test build_native_dpdk _build_native_dpdk
00:02:07.723 14:15:03 -- common/autotest_common.sh@1105 -- $ '[' 2 -le 1 ']'
00:02:07.723 14:15:03 -- common/autotest_common.sh@1111 -- $ xtrace_disable
00:02:07.723 14:15:03 -- common/autotest_common.sh@10 -- $ set +x
00:02:07.723 ************************************
00:02:07.723 START TEST build_native_dpdk
00:02:07.723 ************************************
00:02:07.723 14:15:03 build_native_dpdk -- common/autotest_common.sh@1129 -- $ _build_native_dpdk
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@50 -- $ local compiler_version
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@51 -- $ local compiler
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@52 -- $ local dpdk_kmods
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@53 -- $ local repo=dpdk
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@55 -- $ compiler=gcc
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@61 -- $ export CC=gcc
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@61 -- $ CC=gcc
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]]
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]]
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@68 -- $ gcc -dumpversion
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@68 -- $ compiler_version=13
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@69 -- $ compiler_version=13
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@73 -- $ [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]]
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5
00:02:07.723 eeb0605f11 version: 23.11.0
00:02:07.723 238778122a doc: update release notes for 23.11
00:02:07.723 46aa6b3cfc doc: fix description of RSS features
00:02:07.723 dd88f51a57 devtools: forbid DPDK API in cnxk base driver
00:02:07.723 7e421ae345 devtools: support skipping forbid rule check
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon'
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@86 -- $ dpdk_ldflags=
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@87 -- $ dpdk_ver=23.11.0
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]]
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]]
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror'
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]]
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]]
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow'
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base")
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]]
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]]
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@139 -- $ [[ 0 -eq 1 ]]
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@167 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@168 -- $ uname -s
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']'
00:02:07.723 14:15:03 build_native_dpdk -- common/autobuild_common.sh@169 -- $ lt 23.11.0 21.11.0
00:02:07.723 14:15:03 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 23.11.0 '<' 21.11.0
00:02:07.723 14:15:03 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l
00:02:07.723 14:15:03 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l
00:02:07.723 14:15:03 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-:
00:02:07.723 14:15:03 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-:
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<'
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@345 -- $ : 1
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 ))
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 23
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@353 -- $ local d=23
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 23 =~ ^[0-9]+$ ]]
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@355 -- $ echo 23
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=23
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 21
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@353 -- $ local d=21
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 21 =~ ^[0-9]+$ ]]
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@355 -- $ echo 21
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=21
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] ))
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@367 -- $ return 1
00:02:07.724 14:15:03 build_native_dpdk -- common/autobuild_common.sh@173 -- $ patch -p1
00:02:07.724 patching file config/rte_config.h
00:02:07.724 Hunk #1 succeeded at 60 (offset 1 line).
00:02:07.724 14:15:03 build_native_dpdk -- common/autobuild_common.sh@176 -- $ lt 23.11.0 24.07.0
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 23.11.0 '<' 24.07.0
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-:
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-:
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<'
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@345 -- $ : 1
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 ))
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 23
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@353 -- $ local d=23
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 23 =~ ^[0-9]+$ ]]
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@355 -- $ echo 23
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=23
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]]
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] ))
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] ))
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@368 -- $ return 0
00:02:07.724 14:15:03 build_native_dpdk -- common/autobuild_common.sh@177 -- $ patch -p1
00:02:07.724 patching file lib/pcapng/rte_pcapng.c
00:02:07.724 14:15:03 build_native_dpdk -- common/autobuild_common.sh@179 -- $ ge 23.11.0 24.07.0
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@376 -- $ cmp_versions 23.11.0 '>=' 24.07.0
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-:
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-:
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=>='
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@348 -- $ : 1
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 ))
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 23
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@353 -- $ local d=23
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 23 =~ ^[0-9]+$ ]]
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@355 -- $ echo 23
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=23
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]]
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] ))
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] ))
00:02:07.724 14:15:03 build_native_dpdk -- scripts/common.sh@368 -- $ return 1
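The three cmp_versions traces above show scripts/common.sh deciding which DPDK compatibility fixups apply to this checkout: each version string is split on '.', '-' and ':' and compared component by component, so lt 23.11.0 21.11.0 is false, lt 23.11.0 24.07.0 is true, and ge 23.11.0 24.07.0 is false, with the config/rte_config.h and lib/pcapng/rte_pcapng.c patches above applied along the way. A condensed sketch of that comparison logic, for illustration only (cmp_versions_sketch is a hypothetical name, not the helper in scripts/common.sh, and it skips the decimal validation the real helper performs on each component):

    cmp_versions_sketch() {   # usage: cmp_versions_sketch 23.11.0 '<' 24.07.0
        local IFS=.-:         # split both version strings on '.', '-' and ':'
        local -a ver1=($1) ver2=($3)
        local v
        for ((v = 0; v < ${#ver1[@]} || v < ${#ver2[@]}; v++)); do
            if (( ${ver1[v]:-0} > ${ver2[v]:-0} )); then
                [[ $2 == *'>'* ]]; return   # first differing component decides
            elif (( ${ver1[v]:-0} < ${ver2[v]:-0} )); then
                [[ $2 == *'<'* ]]; return
            fi
        done
        [[ $2 == *'='* ]]                   # all components equal
    }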
00:02:13.009 Compiler for C supports arguments -march=native: YES 00:02:13.009 Checking for size of "void *" : 8 00:02:13.009 Checking for size of "void *" : 8 (cached) 00:02:13.009 Library m found: YES 00:02:13.009 Library numa found: YES 00:02:13.009 Has header "numaif.h" : YES 00:02:13.009 Library fdt found: NO 00:02:13.009 Library execinfo found: NO 00:02:13.009 Has header "execinfo.h" : YES 00:02:13.009 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:02:13.009 Run-time dependency libarchive found: NO (tried pkgconfig) 00:02:13.009 Run-time dependency libbsd found: NO (tried pkgconfig) 00:02:13.009 Run-time dependency jansson found: NO (tried pkgconfig) 00:02:13.009 Run-time dependency openssl found: YES 3.1.1 00:02:13.009 Run-time dependency libpcap found: YES 1.10.4 00:02:13.009 Has header "pcap.h" with dependency libpcap: YES 00:02:13.009 Compiler for C supports arguments -Wcast-qual: YES 00:02:13.009 Compiler for C supports arguments -Wdeprecated: YES 00:02:13.009 Compiler for C supports arguments -Wformat: YES 00:02:13.009 Compiler for C supports arguments -Wformat-nonliteral: NO 00:02:13.009 Compiler for C supports arguments -Wformat-security: NO 00:02:13.009 Compiler for C supports arguments -Wmissing-declarations: YES 00:02:13.009 Compiler for C supports arguments -Wmissing-prototypes: YES 00:02:13.009 Compiler for C supports arguments -Wnested-externs: YES 00:02:13.009 Compiler for C supports arguments -Wold-style-definition: YES 00:02:13.009 Compiler for C supports arguments -Wpointer-arith: YES 00:02:13.009 Compiler for C supports arguments -Wsign-compare: YES 00:02:13.009 Compiler for C supports arguments -Wstrict-prototypes: YES 00:02:13.009 Compiler for C supports arguments -Wundef: YES 00:02:13.009 Compiler for C supports arguments -Wwrite-strings: YES 00:02:13.009 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:02:13.009 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:02:13.009 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:02:13.009 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:02:13.009 Program objdump found: YES (/usr/bin/objdump) 00:02:13.009 Compiler for C supports arguments -mavx512f: YES 00:02:13.009 Checking if "AVX512 checking" compiles: YES 00:02:13.009 Fetching value of define "__SSE4_2__" : 1 00:02:13.009 Fetching value of define "__AES__" : 1 00:02:13.009 Fetching value of define "__AVX__" : 1 00:02:13.009 Fetching value of define "__AVX2__" : 1 00:02:13.009 Fetching value of define "__AVX512BW__" : 1 00:02:13.009 Fetching value of define "__AVX512CD__" : 1 00:02:13.009 Fetching value of define "__AVX512DQ__" : 1 00:02:13.009 Fetching value of define "__AVX512F__" : 1 00:02:13.009 Fetching value of define "__AVX512VL__" : 1 00:02:13.009 Fetching value of define "__PCLMUL__" : 1 00:02:13.009 Fetching value of define "__RDRND__" : 1 00:02:13.009 Fetching value of define "__RDSEED__" : 1 00:02:13.009 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:02:13.009 Fetching value of define "__znver1__" : (undefined) 00:02:13.009 Fetching value of define "__znver2__" : (undefined) 00:02:13.010 Fetching value of define "__znver3__" : (undefined) 00:02:13.010 Fetching value of define "__znver4__" : (undefined) 00:02:13.010 Compiler for C supports arguments -Wno-format-truncation: YES 00:02:13.010 Message: lib/log: Defining dependency "log" 00:02:13.010 Message: lib/kvargs: Defining dependency "kvargs" 00:02:13.010 Message: lib/telemetry: Defining dependency 
"telemetry" 00:02:13.010 Checking for function "getentropy" : NO 00:02:13.010 Message: lib/eal: Defining dependency "eal" 00:02:13.010 Message: lib/ring: Defining dependency "ring" 00:02:13.010 Message: lib/rcu: Defining dependency "rcu" 00:02:13.010 Message: lib/mempool: Defining dependency "mempool" 00:02:13.010 Message: lib/mbuf: Defining dependency "mbuf" 00:02:13.010 Fetching value of define "__PCLMUL__" : 1 (cached) 00:02:13.010 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:13.010 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:13.010 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:13.010 Fetching value of define "__AVX512VL__" : 1 (cached) 00:02:13.010 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:02:13.010 Compiler for C supports arguments -mpclmul: YES 00:02:13.010 Compiler for C supports arguments -maes: YES 00:02:13.010 Compiler for C supports arguments -mavx512f: YES (cached) 00:02:13.010 Compiler for C supports arguments -mavx512bw: YES 00:02:13.010 Compiler for C supports arguments -mavx512dq: YES 00:02:13.010 Compiler for C supports arguments -mavx512vl: YES 00:02:13.010 Compiler for C supports arguments -mvpclmulqdq: YES 00:02:13.010 Compiler for C supports arguments -mavx2: YES 00:02:13.010 Compiler for C supports arguments -mavx: YES 00:02:13.010 Message: lib/net: Defining dependency "net" 00:02:13.010 Message: lib/meter: Defining dependency "meter" 00:02:13.010 Message: lib/ethdev: Defining dependency "ethdev" 00:02:13.010 Message: lib/pci: Defining dependency "pci" 00:02:13.010 Message: lib/cmdline: Defining dependency "cmdline" 00:02:13.010 Message: lib/metrics: Defining dependency "metrics" 00:02:13.010 Message: lib/hash: Defining dependency "hash" 00:02:13.010 Message: lib/timer: Defining dependency "timer" 00:02:13.010 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:13.010 Fetching value of define "__AVX512VL__" : 1 (cached) 00:02:13.010 Fetching value of define "__AVX512CD__" : 1 (cached) 00:02:13.010 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:13.010 Message: lib/acl: Defining dependency "acl" 00:02:13.010 Message: lib/bbdev: Defining dependency "bbdev" 00:02:13.010 Message: lib/bitratestats: Defining dependency "bitratestats" 00:02:13.010 Run-time dependency libelf found: YES 0.191 00:02:13.010 Message: lib/bpf: Defining dependency "bpf" 00:02:13.010 Message: lib/cfgfile: Defining dependency "cfgfile" 00:02:13.010 Message: lib/compressdev: Defining dependency "compressdev" 00:02:13.010 Message: lib/cryptodev: Defining dependency "cryptodev" 00:02:13.010 Message: lib/distributor: Defining dependency "distributor" 00:02:13.010 Message: lib/dmadev: Defining dependency "dmadev" 00:02:13.010 Message: lib/efd: Defining dependency "efd" 00:02:13.010 Message: lib/eventdev: Defining dependency "eventdev" 00:02:13.010 Message: lib/dispatcher: Defining dependency "dispatcher" 00:02:13.010 Message: lib/gpudev: Defining dependency "gpudev" 00:02:13.010 Message: lib/gro: Defining dependency "gro" 00:02:13.010 Message: lib/gso: Defining dependency "gso" 00:02:13.010 Message: lib/ip_frag: Defining dependency "ip_frag" 00:02:13.010 Message: lib/jobstats: Defining dependency "jobstats" 00:02:13.010 Message: lib/latencystats: Defining dependency "latencystats" 00:02:13.010 Message: lib/lpm: Defining dependency "lpm" 00:02:13.010 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:13.010 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:13.010 Fetching value of define "__AVX512IFMA__" : 
(undefined) 00:02:13.010 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES 00:02:13.010 Message: lib/member: Defining dependency "member" 00:02:13.010 Message: lib/pcapng: Defining dependency "pcapng" 00:02:13.010 Compiler for C supports arguments -Wno-cast-qual: YES 00:02:13.010 Message: lib/power: Defining dependency "power" 00:02:13.010 Message: lib/rawdev: Defining dependency "rawdev" 00:02:13.010 Message: lib/regexdev: Defining dependency "regexdev" 00:02:13.010 Message: lib/mldev: Defining dependency "mldev" 00:02:13.010 Message: lib/rib: Defining dependency "rib" 00:02:13.010 Message: lib/reorder: Defining dependency "reorder" 00:02:13.010 Message: lib/sched: Defining dependency "sched" 00:02:13.010 Message: lib/security: Defining dependency "security" 00:02:13.010 Message: lib/stack: Defining dependency "stack" 00:02:13.010 Has header "linux/userfaultfd.h" : YES 00:02:13.010 Has header "linux/vduse.h" : YES 00:02:13.010 Message: lib/vhost: Defining dependency "vhost" 00:02:13.010 Message: lib/ipsec: Defining dependency "ipsec" 00:02:13.010 Message: lib/pdcp: Defining dependency "pdcp" 00:02:13.010 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:13.010 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:13.010 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:13.010 Message: lib/fib: Defining dependency "fib" 00:02:13.010 Message: lib/port: Defining dependency "port" 00:02:13.010 Message: lib/pdump: Defining dependency "pdump" 00:02:13.010 Message: lib/table: Defining dependency "table" 00:02:13.010 Message: lib/pipeline: Defining dependency "pipeline" 00:02:13.010 Message: lib/graph: Defining dependency "graph" 00:02:13.010 Message: lib/node: Defining dependency "node" 00:02:13.010 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:02:13.948 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:02:13.948 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:02:13.948 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:02:13.948 Compiler for C supports arguments -Wno-sign-compare: YES 00:02:13.948 Compiler for C supports arguments -Wno-unused-value: YES 00:02:13.948 Compiler for C supports arguments -Wno-format: YES 00:02:13.948 Compiler for C supports arguments -Wno-format-security: YES 00:02:13.948 Compiler for C supports arguments -Wno-format-nonliteral: YES 00:02:13.948 Compiler for C supports arguments -Wno-strict-aliasing: YES 00:02:13.948 Compiler for C supports arguments -Wno-unused-but-set-variable: YES 00:02:13.948 Compiler for C supports arguments -Wno-unused-parameter: YES 00:02:13.948 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:13.948 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:13.948 Compiler for C supports arguments -mavx512f: YES (cached) 00:02:13.948 Compiler for C supports arguments -mavx512bw: YES (cached) 00:02:13.948 Compiler for C supports arguments -march=skylake-avx512: YES 00:02:13.948 Message: drivers/net/i40e: Defining dependency "net_i40e" 00:02:13.948 Has header "sys/epoll.h" : YES 00:02:13.948 Program doxygen found: YES (/usr/local/bin/doxygen) 00:02:13.948 Configuring doxy-api-html.conf using configuration 00:02:13.948 Configuring doxy-api-man.conf using configuration 00:02:13.948 Program mandb found: YES (/usr/bin/mandb) 00:02:13.948 Program sphinx-build found: NO 00:02:13.948 Configuring rte_build_config.h using configuration 00:02:13.948 Message: 00:02:13.948 ================= 00:02:13.948 Applications Enabled 
00:02:13.948 =================
00:02:13.948 
00:02:13.948 apps:
00:02:13.948     dumpcap, graph, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf,
00:02:13.948     test-crypto-perf, test-dma-perf, test-eventdev, test-fib, test-flow-perf, test-gpudev, test-mldev, test-pipeline,
00:02:13.948     test-pmd, test-regex, test-sad, test-security-perf,
00:02:13.948 
00:02:13.948 Message:
00:02:13.948 =================
00:02:13.948 Libraries Enabled
00:02:13.948 =================
00:02:13.948 
00:02:13.948 libs:
00:02:13.948     log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf,
00:02:13.948     net, meter, ethdev, pci, cmdline, metrics, hash, timer,
00:02:13.948     acl, bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor,
00:02:13.948     dmadev, efd, eventdev, dispatcher, gpudev, gro, gso, ip_frag,
00:02:13.948     jobstats, latencystats, lpm, member, pcapng, power, rawdev, regexdev,
00:02:13.948     mldev, rib, reorder, sched, security, stack, vhost, ipsec,
00:02:13.948     pdcp, fib, port, pdump, table, pipeline, graph, node,
00:02:13.948 
00:02:13.948 
00:02:13.948 Message:
00:02:13.948 ===============
00:02:13.948 Drivers Enabled
00:02:13.948 ===============
00:02:13.948 
00:02:13.948 common:
00:02:13.948 
00:02:13.948 bus:
00:02:13.948     pci, vdev,
00:02:13.948 mempool:
00:02:13.948     ring,
00:02:13.948 dma:
00:02:13.948 
00:02:13.948 net:
00:02:13.948     i40e,
00:02:13.948 raw:
00:02:13.948 
00:02:13.948 crypto:
00:02:13.948 
00:02:13.948 compress:
00:02:13.948 
00:02:13.948 regex:
00:02:13.948 
00:02:13.948 ml:
00:02:13.948 
00:02:13.948 vdpa:
00:02:13.948 
00:02:13.948 event:
00:02:13.948 
00:02:13.948 baseband:
00:02:13.948 
00:02:13.948 gpu:
00:02:13.948 
00:02:13.948 
00:02:13.948 Message:
00:02:13.948 =================
00:02:13.948 Content Skipped
00:02:13.948 =================
00:02:13.948 
00:02:13.948 apps:
00:02:13.948 
00:02:13.948 libs:
00:02:13.948 
00:02:13.948 drivers:
00:02:13.948     common/cpt: not in enabled drivers build config
00:02:13.948     common/dpaax: not in enabled drivers build config
00:02:13.948     common/iavf: not in enabled drivers build config
00:02:13.948     common/idpf: not in enabled drivers build config
00:02:13.948     common/mvep: not in enabled drivers build config
00:02:13.948     common/octeontx: not in enabled drivers build config
00:02:13.948     bus/auxiliary: not in enabled drivers build config
00:02:13.948     bus/cdx: not in enabled drivers build config
00:02:13.948     bus/dpaa: not in enabled drivers build config
00:02:13.948     bus/fslmc: not in enabled drivers build config
00:02:13.948     bus/ifpga: not in enabled drivers build config
00:02:13.948     bus/platform: not in enabled drivers build config
00:02:13.948     bus/vmbus: not in enabled drivers build config
00:02:13.948     common/cnxk: not in enabled drivers build config
00:02:13.948     common/mlx5: not in enabled drivers build config
00:02:13.948     common/nfp: not in enabled drivers build config
00:02:13.948     common/qat: not in enabled drivers build config
00:02:13.948     common/sfc_efx: not in enabled drivers build config
00:02:13.949     mempool/bucket: not in enabled drivers build config
00:02:13.949     mempool/cnxk: not in enabled drivers build config
00:02:13.949     mempool/dpaa: not in enabled drivers build config
00:02:13.949     mempool/dpaa2: not in enabled drivers build config
00:02:13.949     mempool/octeontx: not in enabled drivers build config
00:02:13.949     mempool/stack: not in enabled drivers build config
00:02:13.949     dma/cnxk: not in enabled drivers build config
00:02:13.949     dma/dpaa: not in enabled drivers build config
00:02:13.949     dma/dpaa2: not in enabled drivers build config
00:02:13.949     dma/hisilicon: not in enabled drivers build config
00:02:13.949     dma/idxd: not in enabled drivers build config
00:02:13.949     dma/ioat: not in enabled drivers build config
00:02:13.949     dma/skeleton: not in enabled drivers build config
00:02:13.949     net/af_packet: not in enabled drivers build config
00:02:13.949     net/af_xdp: not in enabled drivers build config
00:02:13.949     net/ark: not in enabled drivers build config
00:02:13.949     net/atlantic: not in enabled drivers build config
00:02:13.949     net/avp: not in enabled drivers build config
00:02:13.949     net/axgbe: not in enabled drivers build config
00:02:13.949     net/bnx2x: not in enabled drivers build config
00:02:13.949     net/bnxt: not in enabled drivers build config
00:02:13.949     net/bonding: not in enabled drivers build config
00:02:13.949     net/cnxk: not in enabled drivers build config
00:02:13.949     net/cpfl: not in enabled drivers build config
00:02:13.949     net/cxgbe: not in enabled drivers build config
00:02:13.949     net/dpaa: not in enabled drivers build config
00:02:13.949     net/dpaa2: not in enabled drivers build config
00:02:13.949     net/e1000: not in enabled drivers build config
00:02:13.949     net/ena: not in enabled drivers build config
00:02:13.949     net/enetc: not in enabled drivers build config
00:02:13.949     net/enetfec: not in enabled drivers build config
00:02:13.949     net/enic: not in enabled drivers build config
00:02:13.949     net/failsafe: not in enabled drivers build config
00:02:13.949     net/fm10k: not in enabled drivers build config
00:02:13.949     net/gve: not in enabled drivers build config
00:02:13.949     net/hinic: not in enabled drivers build config
00:02:13.949     net/hns3: not in enabled drivers build config
00:02:13.949     net/iavf: not in enabled drivers build config
00:02:13.949     net/ice: not in enabled drivers build config
00:02:13.949     net/idpf: not in enabled drivers build config
00:02:13.949     net/igc: not in enabled drivers build config
00:02:13.949     net/ionic: not in enabled drivers build config
00:02:13.949     net/ipn3ke: not in enabled drivers build config
00:02:13.949     net/ixgbe: not in enabled drivers build config
00:02:13.949     net/mana: not in enabled drivers build config
00:02:13.949     net/memif: not in enabled drivers build config
00:02:13.949     net/mlx4: not in enabled drivers build config
00:02:13.949     net/mlx5: not in enabled drivers build config
00:02:13.949     net/mvneta: not in enabled drivers build config
00:02:13.949     net/mvpp2: not in enabled drivers build config
00:02:13.949     net/netvsc: not in enabled drivers build config
00:02:13.949     net/nfb: not in enabled drivers build config
00:02:13.949     net/nfp: not in enabled drivers build config
00:02:13.949     net/ngbe: not in enabled drivers build config
00:02:13.949     net/null: not in enabled drivers build config
00:02:13.949     net/octeontx: not in enabled drivers build config
00:02:13.949     net/octeon_ep: not in enabled drivers build config
00:02:13.949     net/pcap: not in enabled drivers build config
00:02:13.949     net/pfe: not in enabled drivers build config
00:02:13.949     net/qede: not in enabled drivers build config
00:02:13.949     net/ring: not in enabled drivers build config
00:02:13.949     net/sfc: not in enabled drivers build config
00:02:13.949     net/softnic: not in enabled drivers build config
00:02:13.949     net/tap: not in enabled drivers build config
00:02:13.949     net/thunderx: not in enabled drivers build config
00:02:13.949     net/txgbe: not in enabled drivers build config
00:02:13.949     net/vdev_netvsc: not in enabled drivers build config
00:02:13.949     net/vhost: not in enabled drivers build config
00:02:13.949     net/virtio: not in enabled drivers build config
00:02:13.949     net/vmxnet3: not in enabled drivers build config
00:02:13.949     raw/cnxk_bphy: not in enabled drivers build config
00:02:13.949     raw/cnxk_gpio: not in enabled drivers build config
00:02:13.949     raw/dpaa2_cmdif: not in enabled drivers build config
00:02:13.949     raw/ifpga: not in enabled drivers build config
00:02:13.949     raw/ntb: not in enabled drivers build config
00:02:13.949     raw/skeleton: not in enabled drivers build config
00:02:13.949     crypto/armv8: not in enabled drivers build config
00:02:13.949     crypto/bcmfs: not in enabled drivers build config
00:02:13.949     crypto/caam_jr: not in enabled drivers build config
00:02:13.949     crypto/ccp: not in enabled drivers build config
00:02:13.949     crypto/cnxk: not in enabled drivers build config
00:02:13.949     crypto/dpaa_sec: not in enabled drivers build config
00:02:13.949     crypto/dpaa2_sec: not in enabled drivers build config
00:02:13.949     crypto/ipsec_mb: not in enabled drivers build config
00:02:13.949     crypto/mlx5: not in enabled drivers build config
00:02:13.949     crypto/mvsam: not in enabled drivers build config
00:02:13.949     crypto/nitrox: not in enabled drivers build config
00:02:13.949     crypto/null: not in enabled drivers build config
00:02:13.949     crypto/octeontx: not in enabled drivers build config
00:02:13.949     crypto/openssl: not in enabled drivers build config
00:02:13.949     crypto/scheduler: not in enabled drivers build config
00:02:13.949     crypto/uadk: not in enabled drivers build config
00:02:13.949     crypto/virtio: not in enabled drivers build config
00:02:13.949     compress/isal: not in enabled drivers build config
00:02:13.949     compress/mlx5: not in enabled drivers build config
00:02:13.949     compress/octeontx: not in enabled drivers build config
00:02:13.949     compress/zlib: not in enabled drivers build config
00:02:13.949     regex/mlx5: not in enabled drivers build config
00:02:13.949     regex/cn9k: not in enabled drivers build config
00:02:13.949     ml/cnxk: not in enabled drivers build config
00:02:13.949     vdpa/ifc: not in enabled drivers build config
00:02:13.949     vdpa/mlx5: not in enabled drivers build config
00:02:13.949     vdpa/nfp: not in enabled drivers build config
00:02:13.949     vdpa/sfc: not in enabled drivers build config
00:02:13.949     event/cnxk: not in enabled drivers build config
00:02:13.949     event/dlb2: not in enabled drivers build config
00:02:13.949     event/dpaa: not in enabled drivers build config
00:02:13.949     event/dpaa2: not in enabled drivers build config
00:02:13.949     event/dsw: not in enabled drivers build config
00:02:13.949     event/opdl: not in enabled drivers build config
00:02:13.949     event/skeleton: not in enabled drivers build config
00:02:13.949     event/sw: not in enabled drivers build config
00:02:13.949     event/octeontx: not in enabled drivers build config
00:02:13.949     baseband/acc: not in enabled drivers build config
00:02:13.949     baseband/fpga_5gnr_fec: not in enabled drivers build config
00:02:13.949     baseband/fpga_lte_fec: not in enabled drivers build config
00:02:13.949     baseband/la12xx: not in enabled drivers build config
00:02:13.949     baseband/null: not in enabled drivers build config
00:02:13.949     baseband/turbo_sw: not in enabled drivers build config
00:02:13.949     gpu/cuda: not in enabled drivers build config
00:02:13.949 
00:02:13.949 
00:02:13.949 Build targets in project: 217
00:02:13.949 
00:02:13.949 DPDK 23.11.0
00:02:13.949 
00:02:13.949 User defined options
00:02:13.949     libdir : lib
00:02:13.949     prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:02:13.949     c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow
00:02:13.949     c_link_args :
00:02:13.949     enable_docs : false
00:02:13.949     enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,
00:02:13.949     enable_kmods : false
00:02:13.949     machine : native
00:02:13.949     tests : false
00:02:13.949 
00:02:13.949 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:02:13.949 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated.
00:02:13.949 14:15:10 build_native_dpdk -- common/autobuild_common.sh@192 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112
00:02:14.219 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp'
00:02:14.219 [1/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:02:14.219 [2/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:02:14.219 [3/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:02:14.481 [4/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:14.481 [5/707] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:02:14.481 [6/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:14.481 [7/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:14.481 [8/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:14.481 [9/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:02:14.481 [10/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:02:14.481 [11/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:14.481 [12/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:14.481 [13/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:14.481 [14/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:02:14.481 [15/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:14.481 [16/707] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:02:14.481 [17/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:14.481 [18/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:14.481 [19/707] Linking static target lib/librte_kvargs.a 00:02:14.481 [20/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:14.481 [21/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:14.481 [22/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:14.481 [23/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:02:14.481 [24/707] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:14.481 [25/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:14.481 [26/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:14.481 [27/707] Linking static target lib/librte_pci.a 00:02:14.481 [28/707] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:14.481 [29/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:14.481 [30/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:14.481 [31/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o
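
[editor's note] The configuration summary above maps onto a plain Meson invocation along these lines. This is a minimal sketch reconstructed from the logged "User defined options", not the literal command (the job drives the build through SPDK's common/autobuild_common.sh wrapper), and per the WARNING above, current Meson expects the explicit `meson setup` form:

    # Sketch only: options and paths are taken from the summary in this log.
    meson setup build-tmp \
        --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build \
        --libdir=lib \
        -Dc_args='-fPIC -g -fcommon -Werror -Wno-stringop-overflow' \
        -Denable_docs=false \
        -Denable_kmods=false \
        -Dtests=false \
        -Dmachine=native \
        -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base
    # The logged build step then runs ninja against that build directory:
    ninja -C build-tmp -j112
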
00:02:14.481 [32/707] Compiling C object lib/librte_log.a.p/log_log.c.o 00:02:14.481 [33/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:14.481 [34/707] Linking static target lib/librte_log.a 00:02:14.748 [35/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:14.748 [36/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:14.748 [37/707] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:14.748 [38/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:14.748 [39/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:15.010 [40/707] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:15.010 [41/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:15.010 [42/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:15.010 [43/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:15.010 [44/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:15.010 [45/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:15.010 [46/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:15.010 [47/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:15.010 [48/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:15.010 [49/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:15.010 [50/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:15.010 [51/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:15.010 [52/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:15.010 [53/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:15.010 [54/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:15.010 [55/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:15.010 [56/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:15.010 [57/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:15.010 [58/707] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:15.010 [59/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:15.010 [60/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:15.010 [61/707] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:15.010 [62/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:15.010 [63/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:15.010 [64/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:15.010 [65/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:15.010 [66/707] Linking static target lib/librte_meter.a 00:02:15.010 [67/707] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:15.010 [68/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:15.010 [69/707] Linking static target lib/librte_cmdline.a 00:02:15.010 [70/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:15.010 [71/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:15.010 
[72/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:15.010 [73/707] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:15.010 [74/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:15.010 [75/707] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o 00:02:15.010 [76/707] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:15.010 [77/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:15.010 [78/707] Linking static target lib/librte_ring.a 00:02:15.010 [79/707] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:15.010 [80/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:15.010 [81/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:15.010 [82/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:15.010 [83/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:15.010 [84/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:15.010 [85/707] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:02:15.010 [86/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:15.010 [87/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:15.010 [88/707] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o 00:02:15.010 [89/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:15.277 [90/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:15.277 [91/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:15.277 [92/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:02:15.277 [93/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:15.277 [94/707] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:15.277 [95/707] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o 00:02:15.277 [96/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:15.277 [97/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:15.277 [98/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:02:15.277 [99/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:15.277 [100/707] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:02:15.277 [101/707] Linking static target lib/librte_metrics.a 00:02:15.277 [102/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:02:15.277 [103/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:15.277 [104/707] Linking static target lib/net/libnet_crc_avx512_lib.a 00:02:15.277 [105/707] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:02:15.277 [106/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:02:15.277 [107/707] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:02:15.277 [108/707] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:15.277 [109/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:15.277 [110/707] Linking static target lib/librte_bitratestats.a 00:02:15.277 [111/707] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:15.277 [112/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:15.277 [113/707] Compiling C object 
lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:15.277 [114/707] Linking static target lib/librte_net.a 00:02:15.277 [115/707] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o 00:02:15.277 [116/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:15.277 [117/707] Linking static target lib/librte_cfgfile.a 00:02:15.277 [118/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:15.277 [119/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:15.277 [120/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:15.277 [121/707] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:15.277 [122/707] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:15.277 [123/707] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:15.543 [124/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:15.543 [125/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:15.543 [126/707] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:02:15.543 [127/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:15.543 [128/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:15.543 [129/707] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 00:02:15.543 [130/707] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:15.543 [131/707] Linking target lib/librte_log.so.24.0 00:02:15.543 [132/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:15.543 [133/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:15.543 [134/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:15.543 [135/707] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:15.543 [136/707] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:15.543 [137/707] Linking static target lib/librte_timer.a 00:02:15.543 [138/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:02:15.543 [139/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:15.543 [140/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:02:15.543 [141/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:15.543 [142/707] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:02:15.543 [143/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:02:15.543 [144/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:02:15.543 [145/707] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:15.543 [146/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:15.543 [147/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:15.543 [148/707] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:02:15.543 [149/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:15.543 [150/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:15.543 [151/707] Linking static target lib/librte_bbdev.a 00:02:15.543 [152/707] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o 00:02:15.543 [153/707] Linking static target lib/librte_mempool.a 00:02:15.543 [154/707] 
Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:02:15.807 [155/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:15.807 [156/707] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols 00:02:15.807 [157/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:15.807 [158/707] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:15.807 [159/707] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:15.807 [160/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:02:15.807 [161/707] Linking target lib/librte_kvargs.so.24.0 00:02:15.808 [162/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:02:15.808 [163/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:15.808 [164/707] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:02:15.808 [165/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:02:15.808 [166/707] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:02:15.808 [167/707] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:15.808 [168/707] Linking static target lib/librte_jobstats.a 00:02:15.808 [169/707] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:02:15.808 [170/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:02:15.808 [171/707] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:02:15.808 [172/707] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:02:15.808 [173/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:15.808 [174/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:02:15.808 [175/707] Linking static target lib/librte_compressdev.a 00:02:15.808 [176/707] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:02:15.808 [177/707] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:02:15.808 [178/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:15.808 [179/707] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:02:15.808 [180/707] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:02:15.808 [181/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:02:15.808 [182/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:02:15.808 [183/707] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:02:15.808 [184/707] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:15.808 [185/707] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev_pmd.c.o 00:02:15.808 [186/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils.c.o 00:02:16.071 [187/707] Compiling C object lib/librte_gro.a.p/gro_gro_tcp6.c.o 00:02:16.071 [188/707] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:02:16.071 [189/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:16.071 [190/707] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:16.071 [191/707] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:02:16.071 [192/707] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:02:16.071 [193/707] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 
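
[editor's note] The recurring "Generating lib/<name>.sym_chk" steps interleaved above are DPDK's per-library exported-symbol checks, run as a meson custom command after each shared object links. Conceptually they compare what a .so actually exports against the library's version map; a rough, illustrative equivalent (not the tree's actual check script, and the path assumes the build-tmp layout shown in this log):

    # Illustrative only: dump the dynamic symbols a freshly linked library
    # exports; the real check diffs this against the library's version.map.
    nm --dynamic --defined-only build-tmp/lib/librte_log.so.24.0 | awk '{print $3}' | sort -u
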
00:02:16.071 [194/707] Linking static target lib/member/libsketch_avx512_tmp.a 00:02:16.071 [195/707] Compiling C object lib/librte_dispatcher.a.p/dispatcher_rte_dispatcher.c.o 00:02:16.071 [196/707] Linking static target lib/librte_latencystats.a 00:02:16.071 [197/707] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:02:16.071 [198/707] Linking static target lib/librte_dispatcher.a 00:02:16.071 [199/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:16.071 [200/707] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:02:16.071 [201/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar_bfloat16.c.o 00:02:16.071 [202/707] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:02:16.071 [203/707] Linking static target lib/librte_telemetry.a 00:02:16.071 [204/707] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:16.071 [205/707] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:02:16.071 [206/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:16.071 [207/707] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.071 [208/707] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:02:16.071 [209/707] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:02:16.071 [210/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:02:16.071 [211/707] Linking static target lib/librte_dmadev.a 00:02:16.071 [212/707] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:02:16.071 [213/707] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:16.071 [214/707] Linking static target lib/librte_gpudev.a 00:02:16.071 [215/707] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:16.071 [216/707] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:02:16.071 [217/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:02:16.071 [218/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:02:16.071 [219/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:02:16.071 [220/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:02:16.071 [221/707] Linking static target lib/librte_rcu.a 00:02:16.071 [222/707] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:02:16.071 [223/707] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:02:16.071 [224/707] Linking static target lib/librte_stack.a 00:02:16.071 [225/707] Linking static target lib/librte_eal.a 00:02:16.071 [226/707] Linking static target lib/librte_gro.a 00:02:16.071 [227/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:02:16.071 [228/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:02:16.071 [229/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:02:16.071 [230/707] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:02:16.071 [231/707] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:02:16.071 [232/707] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:16.071 [233/707] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:02:16.071 [234/707] Linking static target lib/librte_distributor.a 00:02:16.071 [235/707] Linking static target lib/librte_regexdev.a 00:02:16.071 [236/707] Linking static target lib/librte_gso.a 
00:02:16.334 [237/707] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:02:16.334 [238/707] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev.c.o 00:02:16.334 [239/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar.c.o 00:02:16.334 [240/707] Linking static target lib/librte_rawdev.a 00:02:16.334 [241/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:16.334 [242/707] Linking static target lib/librte_mldev.a 00:02:16.334 [243/707] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:02:16.334 [244/707] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:16.334 [245/707] Linking static target lib/librte_mbuf.a 00:02:16.334 [246/707] Linking static target lib/librte_power.a 00:02:16.334 [247/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:02:16.334 [248/707] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.334 [249/707] Linking static target lib/librte_ip_frag.a 00:02:16.334 [250/707] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:02:16.334 [251/707] Linking static target lib/librte_pcapng.a 00:02:16.334 [252/707] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:02:16.334 [253/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:02:16.334 [254/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_dma_adapter.c.o 00:02:16.334 [255/707] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:16.334 [256/707] Linking static target lib/librte_reorder.a 00:02:16.334 [257/707] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.334 [258/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_reorder.c.o 00:02:16.334 [259/707] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:16.334 [260/707] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.597 [261/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:02:16.597 [262/707] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:02:16.597 [263/707] Linking static target lib/librte_bpf.a 00:02:16.597 [264/707] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:02:16.597 [265/707] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:16.597 [266/707] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:16.597 [267/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_ctrl_pdu.c.o 00:02:16.597 [268/707] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.597 [269/707] Linking static target lib/librte_security.a 00:02:16.597 [270/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_crypto.c.o 00:02:16.597 [271/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:02:16.597 [272/707] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.597 [273/707] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.597 [274/707] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:16.597 [275/707] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:02:16.597 [276/707] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.597 [277/707] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:02:16.597 [278/707] Compiling C object 
lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:16.597 [279/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_cnt.c.o 00:02:16.597 [280/707] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:02:16.597 [281/707] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:02:16.597 [282/707] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.597 [283/707] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.597 [284/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:16.861 [285/707] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:02:16.861 [286/707] Compiling C object lib/librte_node.a.p/node_null.c.o 00:02:16.861 [287/707] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:02:16.861 [288/707] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:02:16.861 [289/707] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.861 [290/707] Linking static target lib/librte_lpm.a 00:02:16.861 [291/707] Compiling C object lib/librte_pdcp.a.p/pdcp_rte_pdcp.c.o 00:02:16.861 [292/707] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:02:16.861 [293/707] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:16.861 [294/707] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.861 [295/707] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.861 [296/707] Linking static target lib/librte_rib.a 00:02:16.861 [297/707] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.861 [298/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:16.861 [299/707] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.861 [300/707] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.861 [301/707] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:02:16.861 [302/707] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:02:16.861 [303/707] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:16.861 [304/707] Generating lib/dispatcher.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.861 [305/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:02:16.861 [306/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:02:16.861 [307/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:02:16.861 [308/707] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:02:16.861 [309/707] Linking target lib/librte_telemetry.so.24.0 00:02:16.861 [310/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:02:16.861 [311/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:02:16.861 [312/707] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:02:16.861 [313/707] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:02:16.861 [314/707] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:02:16.861 [315/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:02:16.861 [316/707] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:02:17.121 [317/707] 
Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.121 [318/707] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:02:17.121 [319/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:02:17.121 [320/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:02:17.121 [321/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:02:17.121 [322/707] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.121 [323/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:02:17.121 [324/707] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.121 [325/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:02:17.121 [326/707] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:02:17.121 [327/707] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:02:17.121 [328/707] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:02:17.121 [329/707] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:02:17.121 [330/707] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:02:17.121 [331/707] Linking static target lib/librte_efd.a 00:02:17.121 [332/707] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:02:17.121 [333/707] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:02:17.121 [334/707] Compiling C object lib/librte_graph.a.p/graph_rte_graph_worker.c.o 00:02:17.121 [335/707] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:02:17.121 [336/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:17.121 [337/707] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:02:17.121 [338/707] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:02:17.121 [339/707] Compiling C object lib/librte_graph.a.p/graph_graph_pcap.c.o 00:02:17.122 [340/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:02:17.382 [341/707] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.382 [342/707] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:02:17.382 [343/707] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.382 [344/707] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:02:17.382 [345/707] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:02:17.382 [346/707] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:02:17.382 [347/707] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:02:17.382 [348/707] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:02:17.382 [349/707] Compiling C object lib/librte_node.a.p/node_log.c.o 00:02:17.382 [350/707] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:02:17.382 [351/707] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.382 [352/707] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:02:17.382 [353/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:02:17.382 [354/707] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.382 [355/707] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:02:17.383 [356/707] Compiling C 
object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:02:17.383 [357/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:02:17.383 [358/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:17.383 [359/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:02:17.383 [360/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:02:17.383 [361/707] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:02:17.647 [362/707] Compiling C object lib/librte_node.a.p/node_kernel_tx.c.o 00:02:17.647 [363/707] Linking static target lib/librte_fib.a 00:02:17.647 [364/707] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.647 [365/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:02:17.647 [366/707] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.647 [367/707] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:02:17.647 [368/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:17.647 [369/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:17.647 [370/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:17.647 [371/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:02:17.647 [372/707] Compiling C object lib/librte_node.a.p/node_ip4_reassembly.c.o 00:02:17.647 [373/707] Compiling C object lib/librte_node.a.p/node_ip4_local.c.o 00:02:17.647 [374/707] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.647 [375/707] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:17.647 [376/707] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:02:17.647 [377/707] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:17.647 [378/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:02:17.647 [379/707] Linking static target lib/librte_pdump.a 00:02:17.647 [380/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:02:17.647 [381/707] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:17.647 [382/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:17.647 [383/707] Compiling C object lib/librte_graph.a.p/graph_rte_graph_model_mcore_dispatch.c.o 00:02:17.647 [384/707] Compiling C object app/dpdk-graph.p/graph_cli.c.o 00:02:17.647 [385/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:17.647 [386/707] Linking static target lib/librte_graph.a 00:02:17.647 [387/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:02:17.909 [388/707] Compiling C object app/dpdk-graph.p/graph_ip4_route.c.o 00:02:17.909 [389/707] Compiling C object app/dpdk-graph.p/graph_ethdev_rx.c.o 00:02:17.909 [390/707] Compiling C object app/dpdk-graph.p/graph_mempool.c.o 00:02:17.909 [391/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:02:17.909 [392/707] Compiling C object app/dpdk-graph.p/graph_conn.c.o 00:02:17.909 [393/707] Compiling C object app/dpdk-graph.p/graph_ip6_route.c.o 00:02:17.909 [394/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:02:17.909 [395/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:02:17.909 [396/707] Compiling C object 
lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:02:17.909 [397/707] Compiling C object app/dpdk-graph.p/graph_main.c.o 00:02:17.909 [398/707] Compiling C object lib/librte_node.a.p/node_udp4_input.c.o 00:02:17.909 [399/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_test.c.o 00:02:17.909 [400/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:17.909 [401/707] Compiling C object app/dpdk-graph.p/graph_utils.c.o 00:02:17.909 [402/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:02:17.909 [403/707] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:17.909 [404/707] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:02:17.909 [405/707] Compiling C object app/dpdk-graph.p/graph_l3fwd.c.o 00:02:17.909 [406/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:02:17.909 [407/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:02:17.909 [408/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:02:17.909 [409/707] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:17.909 [410/707] Compiling C object lib/librte_node.a.p/node_kernel_rx.c.o 00:02:17.909 [411/707] Linking static target lib/librte_table.a 00:02:17.909 [412/707] Compiling C object app/dpdk-graph.p/graph_graph.c.o 00:02:18.173 [413/707] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:18.173 [414/707] Compiling C object app/dpdk-graph.p/graph_neigh.c.o 00:02:18.173 [415/707] Linking static target drivers/librte_bus_vdev.a 00:02:18.173 [416/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:02:18.173 [417/707] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:18.173 [418/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_parser.c.o 00:02:18.173 [419/707] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.173 [420/707] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:02:18.173 [421/707] Linking static target lib/librte_sched.a 00:02:18.173 [422/707] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.173 [423/707] Compiling C object app/dpdk-graph.p/graph_ethdev.c.o 00:02:18.173 [424/707] Compiling C object lib/librte_node.a.p/node_ip6_lookup.c.o 00:02:18.173 [425/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_recycle_mbufs_vec_common.c.o 00:02:18.173 [426/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:02:18.173 [427/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:18.173 [428/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:02:18.173 [429/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:02:18.173 [430/707] Linking static target lib/librte_cryptodev.a 00:02:18.173 [431/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:02:18.173 [432/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:02:18.173 [433/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:02:18.173 [434/707] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:02:18.173 [435/707] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:02:18.435 [436/707] 
Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:18.435 [437/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:02:18.435 [438/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_main.c.o 00:02:18.435 [439/707] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o 00:02:18.435 [440/707] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:18.435 [441/707] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:18.435 [442/707] Linking static target drivers/librte_bus_pci.a 00:02:18.435 [443/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:02:18.435 [444/707] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_main.c.o 00:02:18.435 [445/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:02:18.435 [446/707] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:02:18.435 [447/707] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:18.435 [448/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ipsec.c.o 00:02:18.435 [449/707] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:18.435 [450/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:02:18.435 [451/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:02:18.435 [452/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:02:18.435 [453/707] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:02:18.435 [454/707] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:02:18.435 [455/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:02:18.435 [456/707] Linking static target lib/librte_ipsec.a 00:02:18.435 [457/707] Linking static target lib/librte_member.a 00:02:18.435 [458/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_ordered.c.o 00:02:18.435 [459/707] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:02:18.435 [460/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:02:18.435 [461/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_interleave.c.o 00:02:18.435 [462/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_common.c.o 00:02:18.435 [463/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_device_ops.c.o 00:02:18.435 [464/707] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.696 [465/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:02:18.696 [466/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:02:18.696 [467/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_options.c.o 00:02:18.696 [468/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:02:18.696 [469/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:02:18.696 [470/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_stats.c.o 00:02:18.696 [471/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_common.c.o 00:02:18.696 [472/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_ops.c.o 00:02:18.696 [473/707] Generating lib/mldev.sym_chk with a custom command 
(wrapped by meson to capture output) 00:02:18.696 [474/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:02:18.696 [475/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:02:18.696 [476/707] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:02:18.696 [477/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:02:18.696 [478/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:02:18.696 [479/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:02:18.696 [480/707] Compiling C object lib/librte_node.a.p/node_ip6_rewrite.c.o 00:02:18.696 [481/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:02:18.696 [482/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:02:18.696 [483/707] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:18.696 [484/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:02:18.696 [485/707] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.696 [486/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:02:18.696 [487/707] Linking static target lib/librte_node.a 00:02:18.696 [488/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:02:18.696 [489/707] Linking static target lib/librte_hash.a 00:02:18.696 [490/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:02:18.696 [491/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:02:18.696 [492/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_process.c.o 00:02:18.696 [493/707] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.696 [494/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:02:18.696 [495/707] Linking static target lib/librte_pdcp.a 00:02:18.696 [496/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:02:18.696 [497/707] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:18.696 [498/707] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:18.696 [499/707] Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:18.696 [500/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:02:18.696 [501/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:02:18.696 [502/707] Linking static target drivers/librte_mempool_ring.a 00:02:18.696 [503/707] Compiling C object lib/acl/libavx2_tmp.a.p/acl_run_avx2.c.o 00:02:18.956 [504/707] Linking static target lib/acl/libavx2_tmp.a 00:02:18.956 [505/707] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_benchmark.c.o 00:02:18.956 [506/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:02:18.956 [507/707] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:02:18.956 [508/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_cman.c.o 00:02:18.956 [509/707] Linking static target lib/librte_port.a 00:02:18.956 [510/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:02:18.956 [511/707] Generating lib/member.sym_chk with a custom 
command (wrapped by meson to capture output) 00:02:18.956 [512/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:02:18.956 [513/707] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.956 [514/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:02:18.956 [515/707] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:02:18.956 [516/707] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:02:18.956 [517/707] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:02:18.956 [518/707] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.956 [519/707] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:02:18.956 [520/707] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:02:18.956 [521/707] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:02:18.956 [522/707] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:02:18.956 [523/707] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.956 [524/707] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:02:18.956 [525/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:02:18.956 [526/707] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:02:18.956 [527/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:02:18.956 [528/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:02:18.956 [529/707] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:02:18.956 [530/707] Linking static target lib/librte_eventdev.a 00:02:18.956 [531/707] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.217 [532/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:02:19.217 [533/707] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:02:19.217 [534/707] Compiling C object app/dpdk-testpmd.p/test-pmd_recycle_mbufs.c.o 00:02:19.217 [535/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:02:19.217 [536/707] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:02:19.217 [537/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:02:19.217 [538/707] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:02:19.217 [539/707] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:02:19.217 [540/707] Generating lib/pdcp.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.217 [541/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:02:19.217 [542/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:02:19.217 [543/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:02:19.217 [544/707] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:02:19.217 [545/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:02:19.217 [546/707] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:02:19.217 [547/707] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:02:19.217 [548/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:02:19.217 [549/707] 
Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:02:19.217 [550/707] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:02:19.217 [551/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:02:19.217 [552/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:02:19.217 [553/707] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:02:19.217 [554/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:02:19.217 [555/707] Linking static target lib/librte_acl.a 00:02:19.477 [556/707] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:02:19.477 [557/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:02:19.477 [558/707] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.477 [559/707] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:02:19.477 [560/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:02:19.477 [561/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:02:19.477 [562/707] Compiling C object drivers/net/i40e/libi40e_avx2_lib.a.p/i40e_rxtx_vec_avx2.c.o 00:02:19.477 [563/707] Linking static target drivers/net/i40e/libi40e_avx2_lib.a 00:02:19.477 [564/707] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.477 [565/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:02:19.737 [566/707] Linking static target drivers/net/i40e/base/libi40e_base.a 00:02:19.737 [567/707] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:02:19.737 [568/707] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:02:19.737 [569/707] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.737 [570/707] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:02:19.998 [571/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:19.998 [572/707] Linking static target lib/librte_ethdev.a 00:02:19.998 [573/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_common.c.o 00:02:19.998 [574/707] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.998 [575/707] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:02:20.257 [576/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:02:20.517 [577/707] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:02:20.777 [578/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:02:20.777 [579/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:02:21.037 [580/707] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:02:21.296 [581/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:02:21.296 [582/707] Linking static target drivers/libtmp_rte_net_i40e.a 00:02:21.556 [583/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:02:21.556 [584/707] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:02:21.817 [585/707] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:02:21.817 [586/707] Compiling C object drivers/librte_net_i40e.so.24.0.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:02:21.817 [587/707] Linking static target drivers/librte_net_i40e.a 
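
[editor's note] With the net/i40e PMD archives linked above, the build is down to its final applications and shared objects. Assuming a later install step publishes the 217 targets under the prefix from the options block, an external application would typically consume the result through pkg-config; a hedged sketch (my_app.c is a placeholder, and the PKG_CONFIG_PATH value assumes prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build with libdir=lib):

    # Standard libdpdk pkg-config flow; not part of this log.
    export PKG_CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig
    cc -O2 my_app.c -o my_app $(pkg-config --cflags --libs libdpdk)
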
00:02:22.387 [588/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:22.647 [589/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o 00:02:22.908 [590/707] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.908 [591/707] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.169 [592/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o 00:02:29.748 [593/707] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.748 [594/707] Linking target lib/librte_eal.so.24.0 00:02:29.748 [595/707] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols 00:02:29.748 [596/707] Linking target lib/librte_meter.so.24.0 00:02:29.748 [597/707] Linking target lib/librte_jobstats.so.24.0 00:02:29.748 [598/707] Linking target lib/librte_ring.so.24.0 00:02:29.748 [599/707] Linking target lib/librte_pci.so.24.0 00:02:29.748 [600/707] Linking target lib/librte_timer.so.24.0 00:02:29.748 [601/707] Linking target lib/librte_cfgfile.so.24.0 00:02:29.748 [602/707] Linking target lib/librte_dmadev.so.24.0 00:02:29.748 [603/707] Linking target lib/librte_rawdev.so.24.0 00:02:29.748 [604/707] Linking target lib/librte_stack.so.24.0 00:02:29.748 [605/707] Linking target drivers/librte_bus_vdev.so.24.0 00:02:29.748 [606/707] Linking target lib/librte_acl.so.24.0 00:02:29.748 [607/707] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols 00:02:29.748 [608/707] Generating symbol file lib/librte_acl.so.24.0.p/librte_acl.so.24.0.symbols 00:02:29.748 [609/707] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols 00:02:29.748 [610/707] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols 00:02:29.748 [611/707] Generating symbol file drivers/librte_bus_vdev.so.24.0.p/librte_bus_vdev.so.24.0.symbols 00:02:29.748 [612/707] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols 00:02:29.748 [613/707] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols 00:02:29.748 [614/707] Linking target drivers/librte_bus_pci.so.24.0 00:02:29.748 [615/707] Linking target lib/librte_mempool.so.24.0 00:02:29.748 [616/707] Linking target lib/librte_rcu.so.24.0 00:02:29.748 [617/707] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.748 [618/707] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols 00:02:29.748 [619/707] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols 00:02:29.748 [620/707] Generating symbol file drivers/librte_bus_pci.so.24.0.p/librte_bus_pci.so.24.0.symbols 00:02:29.748 [621/707] Linking target drivers/librte_mempool_ring.so.24.0 00:02:29.748 [622/707] Linking target lib/librte_rib.so.24.0 00:02:29.748 [623/707] Linking target lib/librte_mbuf.so.24.0 00:02:29.748 [624/707] Generating symbol file lib/librte_rib.so.24.0.p/librte_rib.so.24.0.symbols 00:02:29.748 [625/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:02:29.748 [626/707] Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols 00:02:29.748 [627/707] Linking target lib/librte_fib.so.24.0 00:02:29.748 [628/707] Linking static target lib/librte_pipeline.a 00:02:29.748 [629/707] Linking target lib/librte_mldev.so.24.0 00:02:29.748 [630/707] Linking target 
lib/librte_regexdev.so.24.0 00:02:29.748 [631/707] Linking target lib/librte_distributor.so.24.0 00:02:29.748 [632/707] Linking target lib/librte_compressdev.so.24.0 00:02:29.748 [633/707] Linking target lib/librte_gpudev.so.24.0 00:02:29.748 [634/707] Linking target lib/librte_bbdev.so.24.0 00:02:29.748 [635/707] Linking target lib/librte_reorder.so.24.0 00:02:29.748 [636/707] Linking target lib/librte_net.so.24.0 00:02:29.748 [637/707] Linking target lib/librte_sched.so.24.0 00:02:29.748 [638/707] Linking target lib/librte_cryptodev.so.24.0 00:02:29.748 [639/707] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols 00:02:29.748 [640/707] Generating symbol file lib/librte_sched.so.24.0.p/librte_sched.so.24.0.symbols 00:02:29.748 [641/707] Generating symbol file lib/librte_reorder.so.24.0.p/librte_reorder.so.24.0.symbols 00:02:29.748 [642/707] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols 00:02:29.748 [643/707] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:29.748 [644/707] Linking static target lib/librte_vhost.a 00:02:29.748 [645/707] Linking target lib/librte_cmdline.so.24.0 00:02:29.748 [646/707] Linking target lib/librte_hash.so.24.0 00:02:29.748 [647/707] Linking target lib/librte_security.so.24.0 00:02:29.748 [648/707] Linking target lib/librte_ethdev.so.24.0 00:02:29.748 [649/707] Generating symbol file lib/librte_security.so.24.0.p/librte_security.so.24.0.symbols 00:02:29.748 [650/707] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols 00:02:30.008 [651/707] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols 00:02:30.008 [652/707] Linking target lib/librte_pdcp.so.24.0 00:02:30.008 [653/707] Linking target lib/librte_pcapng.so.24.0 00:02:30.008 [654/707] Linking target lib/librte_metrics.so.24.0 00:02:30.008 [655/707] Linking target lib/librte_gro.so.24.0 00:02:30.008 [656/707] Linking target lib/librte_gso.so.24.0 00:02:30.008 [657/707] Linking target lib/librte_bpf.so.24.0 00:02:30.008 [658/707] Linking target lib/librte_efd.so.24.0 00:02:30.008 [659/707] Linking target lib/librte_ip_frag.so.24.0 00:02:30.008 [660/707] Linking target lib/librte_lpm.so.24.0 00:02:30.008 [661/707] Linking target lib/librte_power.so.24.0 00:02:30.008 [662/707] Linking target lib/librte_member.so.24.0 00:02:30.008 [663/707] Linking target lib/librte_ipsec.so.24.0 00:02:30.008 [664/707] Linking target lib/librte_eventdev.so.24.0 00:02:30.008 [665/707] Linking target drivers/librte_net_i40e.so.24.0 00:02:30.008 [666/707] Generating symbol file lib/librte_bpf.so.24.0.p/librte_bpf.so.24.0.symbols 00:02:30.008 [667/707] Generating symbol file lib/librte_pcapng.so.24.0.p/librte_pcapng.so.24.0.symbols 00:02:30.008 [668/707] Generating symbol file lib/librte_lpm.so.24.0.p/librte_lpm.so.24.0.symbols 00:02:30.008 [669/707] Generating symbol file lib/librte_ipsec.so.24.0.p/librte_ipsec.so.24.0.symbols 00:02:30.008 [670/707] Generating symbol file lib/librte_ip_frag.so.24.0.p/librte_ip_frag.so.24.0.symbols 00:02:30.008 [671/707] Generating symbol file lib/librte_metrics.so.24.0.p/librte_metrics.so.24.0.symbols 00:02:30.008 [672/707] Generating symbol file lib/librte_eventdev.so.24.0.p/librte_eventdev.so.24.0.symbols 00:02:30.268 [673/707] Linking target app/dpdk-dumpcap 00:02:30.268 [674/707] Linking target lib/librte_pdump.so.24.0 00:02:30.268 [675/707] Linking target lib/librte_latencystats.so.24.0 00:02:30.268 [676/707] Linking target lib/librte_graph.so.24.0 00:02:30.268 
[677/707] Linking target lib/librte_bitratestats.so.24.0 00:02:30.268 [678/707] Linking target app/dpdk-test-cmdline 00:02:30.268 [679/707] Linking target app/dpdk-pdump 00:02:30.268 [680/707] Linking target app/dpdk-test-dma-perf 00:02:30.268 [681/707] Linking target app/dpdk-proc-info 00:02:30.268 [682/707] Linking target lib/librte_dispatcher.so.24.0 00:02:30.268 [683/707] Linking target app/dpdk-test-acl 00:02:30.268 [684/707] Linking target app/dpdk-test-gpudev 00:02:30.268 [685/707] Linking target app/dpdk-test-mldev 00:02:30.268 [686/707] Linking target app/dpdk-test-crypto-perf 00:02:30.268 [687/707] Linking target lib/librte_port.so.24.0 00:02:30.268 [688/707] Linking target app/dpdk-graph 00:02:30.268 [689/707] Linking target app/dpdk-test-compress-perf 00:02:30.268 [690/707] Linking target app/dpdk-test-fib 00:02:30.268 [691/707] Linking target app/dpdk-test-sad 00:02:30.268 [692/707] Linking target app/dpdk-test-regex 00:02:30.268 [693/707] Linking target app/dpdk-test-flow-perf 00:02:30.268 [694/707] Linking target app/dpdk-test-pipeline 00:02:30.268 [695/707] Linking target app/dpdk-test-security-perf 00:02:30.268 [696/707] Linking target app/dpdk-test-bbdev 00:02:30.268 [697/707] Linking target app/dpdk-test-eventdev 00:02:30.268 [698/707] Linking target app/dpdk-testpmd 00:02:30.268 [699/707] Generating symbol file lib/librte_graph.so.24.0.p/librte_graph.so.24.0.symbols 00:02:30.268 [700/707] Generating symbol file lib/librte_port.so.24.0.p/librte_port.so.24.0.symbols 00:02:30.268 [701/707] Linking target lib/librte_node.so.24.0 00:02:30.528 [702/707] Linking target lib/librte_table.so.24.0 00:02:30.528 [703/707] Generating symbol file lib/librte_table.so.24.0.p/librte_table.so.24.0.symbols 00:02:31.910 [704/707] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.910 [705/707] Linking target lib/librte_vhost.so.24.0 00:02:36.117 [706/707] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:36.117 [707/707] Linking target lib/librte_pipeline.so.24.0 00:02:36.117 14:15:31 build_native_dpdk -- common/autobuild_common.sh@194 -- $ uname -s 00:02:36.117 14:15:31 build_native_dpdk -- common/autobuild_common.sh@194 -- $ [[ Linux == \F\r\e\e\B\S\D ]] 00:02:36.117 14:15:31 build_native_dpdk -- common/autobuild_common.sh@207 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 install 00:02:36.117 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:02:36.117 [0/1] Installing files. 
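(The ninja install invocation above replays DPDK's standard meson/ninja flow, and the "Installing ..." listing that follows is its output. As a minimal sketch of the equivalent manual commands — the meson setup step is not shown in this part of the log, so the --prefix value below is an assumption inferred from the .../dpdk/build/... destination paths:

    # Rebuild and install DPDK roughly the way autobuild_common.sh does here.
    # build-tmp and -j112 are taken from the log; --prefix is assumed.
    cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
    meson setup build-tmp --prefix="$PWD/build"
    ninja -C build-tmp -j112            # compile and link ([550/707] .. [707/707] above)
    ninja -C build-tmp -j112 install    # "[0/1] Installing files." plus the listing below
)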
00:02:36.117 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:36.117 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:36.117 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:36.118 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:36.118 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:36.118 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:36.119 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.119 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec_sa.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:02:36.120 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:36.120 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.list to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:36.121 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:36.121 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:36.122 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:36.122 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:36.122 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:36.123 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:36.123 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:36.123 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:36.123 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:36.123 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:36.123 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:36.123 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:36.123 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:36.123 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:36.123 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:36.123 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:36.123 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:36.123 Installing lib/librte_log.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_log.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_kvargs.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_telemetry.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_eal.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_rcu.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 
00:02:36.123 Installing lib/librte_mempool.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_mbuf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_net.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_meter.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_ethdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_cmdline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_metrics.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_hash.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_hash.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_timer.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_acl.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_bbdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_bitratestats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_bpf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_cfgfile.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_compressdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_cryptodev.a to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_cryptodev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_distributor.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_dmadev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_efd.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_eventdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_dispatcher.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_dispatcher.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_gpudev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_gro.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_gso.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_gso.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_ip_frag.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_jobstats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.123 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_latencystats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_lpm.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_member.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_pcapng.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_power.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 
00:02:36.390 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_rawdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_regexdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_regexdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_mldev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_mldev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_rib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_reorder.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_sched.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_security.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_stack.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_vhost.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_ipsec.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_pdcp.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_pdcp.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_fib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_port.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_pdump.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_table.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_pipeline.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_graph.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_node.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing lib/librte_node.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing drivers/librte_bus_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:02:36.390 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing drivers/librte_bus_vdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:02:36.390 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing drivers/librte_mempool_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:02:36.390 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.390 Installing drivers/librte_net_i40e.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:02:36.390 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:36.390 Installing app/dpdk-graph to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:36.390 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:36.390 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:36.390 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:36.390 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:36.390 Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:36.390 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:36.390 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:36.390 Installing app/dpdk-test-dma-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:36.390 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:36.390 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:36.390 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:36.390 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:36.390 Installing app/dpdk-test-mldev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:36.390 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:36.390 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:36.390 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:36.390 Installing app/dpdk-test-sad to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:36.390 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:36.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/log/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:36.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:36.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:36.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:36.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:36.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:36.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:36.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:36.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:36.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:36.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:36.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:36.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.390 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.390 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lock_annotations.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_stdatomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.391 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_dtls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_pdcp_hdr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.392 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_dma_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dispatcher/rte_dispatcher.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.393 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_mcore_dispatch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_rtc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip6_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_udp4_input_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/dpdk-cmdline-gen.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-rss-flows.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:36.394 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:36.394 Installing symlink pointing to librte_log.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so.24 00:02:36.394 Installing symlink pointing to librte_log.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so 00:02:36.394 Installing symlink pointing to librte_kvargs.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.24 00:02:36.395 Installing symlink pointing to librte_kvargs.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:02:36.395 Installing symlink pointing to librte_telemetry.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.24 00:02:36.395 Installing symlink pointing to librte_telemetry.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:02:36.395 Installing symlink pointing to librte_eal.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.24 00:02:36.395 Installing symlink pointing to librte_eal.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:02:36.395 Installing symlink pointing to librte_ring.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.24 00:02:36.395 Installing symlink pointing to librte_ring.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:02:36.395 Installing symlink pointing to librte_rcu.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.24 00:02:36.395 Installing symlink pointing to librte_rcu.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:02:36.395 Installing symlink pointing to librte_mempool.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.24 00:02:36.395 Installing symlink pointing to librte_mempool.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:02:36.395 Installing symlink pointing to librte_mbuf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.24 00:02:36.395 Installing symlink pointing to librte_mbuf.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:02:36.395 Installing symlink pointing to librte_net.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.24 00:02:36.395 Installing symlink pointing to librte_net.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:02:36.395 Installing symlink pointing to librte_meter.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.24 00:02:36.395 Installing symlink pointing to librte_meter.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:02:36.395 Installing symlink pointing to librte_ethdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.24 00:02:36.395 Installing symlink pointing to librte_ethdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:02:36.395 Installing symlink pointing to librte_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.24 00:02:36.395 Installing symlink pointing to librte_pci.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:02:36.395 Installing symlink pointing to librte_cmdline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.24 00:02:36.395 Installing symlink pointing to librte_cmdline.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:02:36.395 Installing symlink pointing to librte_metrics.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.24 00:02:36.395 Installing symlink pointing to librte_metrics.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:02:36.395 Installing symlink pointing to librte_hash.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.24 00:02:36.395 Installing symlink pointing to librte_hash.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:02:36.395 Installing symlink pointing to librte_timer.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.24 00:02:36.395 Installing symlink pointing to librte_timer.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:02:36.395 Installing symlink pointing to librte_acl.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.24 00:02:36.395 Installing symlink pointing to librte_acl.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:02:36.395 Installing symlink pointing to librte_bbdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.24 00:02:36.395 Installing symlink pointing to librte_bbdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:02:36.395 Installing symlink pointing to librte_bitratestats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.24 00:02:36.395 Installing symlink pointing to librte_bitratestats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:02:36.395 Installing symlink pointing to librte_bpf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.24 00:02:36.395 Installing symlink pointing to librte_bpf.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:02:36.395 Installing symlink pointing to librte_cfgfile.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.24 00:02:36.395 Installing symlink pointing to librte_cfgfile.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:02:36.395 Installing symlink pointing to librte_compressdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.24 00:02:36.395 Installing symlink pointing to librte_compressdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:02:36.395 Installing symlink pointing to librte_cryptodev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.24 00:02:36.395 Installing symlink pointing to librte_cryptodev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:02:36.395 Installing symlink pointing to librte_distributor.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.24 00:02:36.395 Installing symlink pointing to librte_distributor.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:02:36.395 Installing symlink pointing to librte_dmadev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.24 00:02:36.395 Installing symlink pointing to librte_dmadev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:02:36.395 Installing symlink pointing to librte_efd.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.24 00:02:36.395 Installing symlink pointing to librte_efd.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:02:36.395 Installing symlink pointing to librte_eventdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.24 00:02:36.395 Installing symlink pointing to librte_eventdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:02:36.395 Installing symlink pointing to librte_dispatcher.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so.24 00:02:36.395 Installing symlink pointing to librte_dispatcher.so.24 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so 00:02:36.395 Installing symlink pointing to librte_gpudev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.24 00:02:36.395 Installing symlink pointing to librte_gpudev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:02:36.395 Installing symlink pointing to librte_gro.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.24 00:02:36.395 Installing symlink pointing to librte_gro.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:02:36.395 Installing symlink pointing to librte_gso.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.24 00:02:36.395 Installing symlink pointing to librte_gso.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:02:36.395 Installing symlink pointing to librte_ip_frag.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.24 00:02:36.395 Installing symlink pointing to librte_ip_frag.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:02:36.395 Installing symlink pointing to librte_jobstats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.24 00:02:36.395 Installing symlink pointing to librte_jobstats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:02:36.395 Installing symlink pointing to librte_latencystats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.24 00:02:36.395 Installing symlink pointing to librte_latencystats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:02:36.395 Installing symlink pointing to librte_lpm.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.24 00:02:36.395 Installing symlink pointing to librte_lpm.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:02:36.395 Installing symlink pointing to librte_member.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.24 00:02:36.395 Installing symlink pointing to librte_member.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:02:36.395 Installing symlink pointing to librte_pcapng.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.24 00:02:36.395 Installing symlink pointing to librte_pcapng.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:02:36.395 Installing symlink pointing to librte_power.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.24 00:02:36.395 Installing symlink pointing to librte_power.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:02:36.395 Installing symlink pointing to librte_rawdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.24 00:02:36.395 Installing symlink pointing to librte_rawdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:02:36.395 Installing symlink pointing to librte_regexdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.24 00:02:36.395 Installing symlink pointing to 
librte_regexdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:02:36.395 Installing symlink pointing to librte_mldev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so.24 00:02:36.395 Installing symlink pointing to librte_mldev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so 00:02:36.395 Installing symlink pointing to librte_rib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.24 00:02:36.395 Installing symlink pointing to librte_rib.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:02:36.395 Installing symlink pointing to librte_reorder.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.24 00:02:36.395 Installing symlink pointing to librte_reorder.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:02:36.395 Installing symlink pointing to librte_sched.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.24 00:02:36.395 Installing symlink pointing to librte_sched.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:02:36.395 Installing symlink pointing to librte_security.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.24 00:02:36.395 Installing symlink pointing to librte_security.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:02:36.395 Installing symlink pointing to librte_stack.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.24 00:02:36.395 Installing symlink pointing to librte_stack.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:02:36.395 Installing symlink pointing to librte_vhost.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.24 00:02:36.396 Installing symlink pointing to librte_vhost.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:02:36.396 Installing symlink pointing to librte_ipsec.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.24 00:02:36.396 Installing symlink pointing to librte_ipsec.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:02:36.396 Installing symlink pointing to librte_pdcp.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so.24 00:02:36.396 Installing symlink pointing to librte_pdcp.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so 00:02:36.396 Installing symlink pointing to librte_fib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.24 00:02:36.396 Installing symlink pointing to librte_fib.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:02:36.396 Installing symlink pointing to librte_port.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.24 00:02:36.396 Installing symlink pointing to librte_port.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:02:36.396 Installing symlink pointing to librte_pdump.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.24 00:02:36.396 Installing symlink pointing to librte_pdump.so.24 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so 00:02:36.396 Installing symlink pointing to librte_table.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.24 00:02:36.396 Installing symlink pointing to librte_table.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so 00:02:36.396 Installing symlink pointing to librte_pipeline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.24 00:02:36.396 Installing symlink pointing to librte_pipeline.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so 00:02:36.396 Installing symlink pointing to librte_graph.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.24 00:02:36.396 Installing symlink pointing to librte_graph.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so 00:02:36.396 Installing symlink pointing to librte_node.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.24 00:02:36.396 './librte_bus_pci.so' -> 'dpdk/pmds-24.0/librte_bus_pci.so' 00:02:36.396 './librte_bus_pci.so.24' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24' 00:02:36.396 './librte_bus_pci.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24.0' 00:02:36.396 './librte_bus_vdev.so' -> 'dpdk/pmds-24.0/librte_bus_vdev.so' 00:02:36.396 './librte_bus_vdev.so.24' -> 'dpdk/pmds-24.0/librte_bus_vdev.so.24' 00:02:36.396 './librte_bus_vdev.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_vdev.so.24.0' 00:02:36.396 './librte_mempool_ring.so' -> 'dpdk/pmds-24.0/librte_mempool_ring.so' 00:02:36.396 './librte_mempool_ring.so.24' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24' 00:02:36.396 './librte_mempool_ring.so.24.0' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24.0' 00:02:36.396 './librte_net_i40e.so' -> 'dpdk/pmds-24.0/librte_net_i40e.so' 00:02:36.396 './librte_net_i40e.so.24' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24' 00:02:36.396 './librte_net_i40e.so.24.0' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24.0' 00:02:36.396 Installing symlink pointing to librte_node.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so 00:02:36.396 Installing symlink pointing to librte_bus_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so.24 00:02:36.396 Installing symlink pointing to librte_bus_pci.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so 00:02:36.396 Installing symlink pointing to librte_bus_vdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so.24 00:02:36.396 Installing symlink pointing to librte_bus_vdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so 00:02:36.396 Installing symlink pointing to librte_mempool_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so.24 00:02:36.396 Installing symlink pointing to librte_mempool_ring.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so 00:02:36.396 Installing symlink pointing to librte_net_i40e.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so.24 00:02:36.396 Installing symlink pointing to librte_net_i40e.so.24 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so 00:02:36.396 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-24.0' 00:02:36.396 14:15:32 build_native_dpdk -- common/autobuild_common.sh@213 -- $ cat 00:02:36.396 14:15:32 build_native_dpdk -- common/autobuild_common.sh@218 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:36.396 00:02:36.396 real 0m28.737s 00:02:36.396 user 8m3.838s 00:02:36.396 sys 2m38.808s 00:02:36.396 14:15:32 build_native_dpdk -- common/autotest_common.sh@1130 -- $ xtrace_disable 00:02:36.396 14:15:32 build_native_dpdk -- common/autotest_common.sh@10 -- $ set +x 00:02:36.396 ************************************ 00:02:36.396 END TEST build_native_dpdk 00:02:36.396 ************************************ 00:02:36.396 14:15:32 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:02:36.396 14:15:32 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:02:36.396 14:15:32 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]] 00:02:36.396 14:15:32 -- spdk/autobuild.sh@52 -- $ llvm_precompile 00:02:36.396 14:15:32 -- common/autobuild_common.sh@438 -- $ run_test autobuild_llvm_precompile _llvm_precompile 00:02:36.396 14:15:32 -- common/autotest_common.sh@1105 -- $ '[' 2 -le 1 ']' 00:02:36.396 14:15:32 -- common/autotest_common.sh@1111 -- $ xtrace_disable 00:02:36.396 14:15:32 -- common/autotest_common.sh@10 -- $ set +x 00:02:36.396 ************************************ 00:02:36.396 START TEST autobuild_llvm_precompile 00:02:36.396 ************************************ 00:02:36.396 14:15:32 autobuild_llvm_precompile -- common/autotest_common.sh@1129 -- $ _llvm_precompile 00:02:36.396 14:15:32 autobuild_llvm_precompile -- common/autobuild_common.sh@32 -- $ clang --version 00:02:36.396 14:15:32 autobuild_llvm_precompile -- common/autobuild_common.sh@32 -- $ [[ clang version 17.0.6 (Fedora 17.0.6-2.fc39) 00:02:36.396 Target: x86_64-redhat-linux-gnu 00:02:36.396 Thread model: posix 00:02:36.396 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]] 00:02:36.396 14:15:32 autobuild_llvm_precompile -- common/autobuild_common.sh@33 -- $ clang_num=17 00:02:36.396 14:15:32 autobuild_llvm_precompile -- common/autobuild_common.sh@35 -- $ export CC=clang-17 00:02:36.396 14:15:32 autobuild_llvm_precompile -- common/autobuild_common.sh@35 -- $ CC=clang-17 00:02:36.396 14:15:32 autobuild_llvm_precompile -- common/autobuild_common.sh@36 -- $ export CXX=clang++-17 00:02:36.396 14:15:32 autobuild_llvm_precompile -- common/autobuild_common.sh@36 -- $ CXX=clang++-17 00:02:36.396 14:15:32 autobuild_llvm_precompile -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a) 00:02:36.396 14:15:32 autobuild_llvm_precompile -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:02:36.396 14:15:32 autobuild_llvm_precompile -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a ]] 00:02:36.396 14:15:32 autobuild_llvm_precompile -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 
--with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a' 00:02:36.396 14:15:32 autobuild_llvm_precompile -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:02:36.656 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:02:36.917 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:36.917 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:36.917 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:02:37.488 Using 'verbs' RDMA provider 00:02:53.328 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done. 00:03:08.231 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:03:08.231 Creating mk/config.mk...done. 00:03:08.231 Creating mk/cc.flags.mk...done. 00:03:08.231 Type 'make' to build. 00:03:08.231 00:03:08.231 real 0m30.724s 00:03:08.231 user 0m13.396s 00:03:08.231 sys 0m16.809s 00:03:08.231 14:16:03 autobuild_llvm_precompile -- common/autotest_common.sh@1130 -- $ xtrace_disable 00:03:08.231 14:16:03 autobuild_llvm_precompile -- common/autotest_common.sh@10 -- $ set +x 00:03:08.231 ************************************ 00:03:08.231 END TEST autobuild_llvm_precompile 00:03:08.231 ************************************ 00:03:08.231 14:16:03 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:03:08.231 14:16:03 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:03:08.231 14:16:03 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:03:08.231 14:16:03 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]] 00:03:08.231 14:16:03 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:03:08.231 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:03:08.231 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:03:08.231 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:03:08.231 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:03:08.231 Using 'verbs' RDMA provider 00:03:21.392 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done. 00:03:33.640 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:03:33.640 Creating mk/config.mk...done. 00:03:33.640 Creating mk/cc.flags.mk...done. 00:03:33.640 Type 'make' to build. 
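The llvm_precompile trace above derives the toolchain from the installed clang: a bash regex captures the version triple, the major number selects CC/CXX, and an extglob pattern locates the matching libFuzzer runtime archive that configure then receives via --with-fuzzer. A minimal standalone sketch of that detection, assuming clang is on PATH; the error messages are illustrative and not from the script:

    shopt -s extglob   # the @(...) and ?(...) patterns below need extglob

    if [[ "$(clang --version)" =~ version\ (([0-9]+)\.([0-9]+)\.([0-9]+)) ]]; then
        clang_version=${BASH_REMATCH[1]}   # e.g. 17.0.6
        clang_num=${BASH_REMATCH[2]}       # e.g. 17
    else
        echo "could not detect clang version" >&2
        exit 1
    fi

    export CC=clang-$clang_num CXX=clang++-$clang_num

    # Pick up the libFuzzer runtime shipped with that clang; the alternation
    # covers both major-only (17) and fully versioned (17.0.6) install layouts.
    fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a)
    fuzzer_lib=${fuzzer_libs[0]}
    [[ -e "$fuzzer_lib" ]] || { echo "libclang_rt.fuzzer_no_main.a not found" >&2; exit 1; }

If the glob matches nothing, the array keeps the literal pattern, which is why the -e existence check (mirroring autobuild_common.sh@40 in the trace) is what actually gates the build.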
00:03:33.640 14:16:29 -- spdk/autobuild.sh@70 -- $ run_test make make -j112 00:03:33.640 14:16:29 -- common/autotest_common.sh@1105 -- $ '[' 3 -le 1 ']' 00:03:33.640 14:16:29 -- common/autotest_common.sh@1111 -- $ xtrace_disable 00:03:33.640 14:16:29 -- common/autotest_common.sh@10 -- $ set +x 00:03:33.640 ************************************ 00:03:33.640 START TEST make 00:03:33.640 ************************************ 00:03:33.640 14:16:29 make -- common/autotest_common.sh@1129 -- $ make -j112 00:03:33.640 make[1]: Nothing to be done for 'all'. 00:03:35.547 The Meson build system 00:03:35.547 Version: 1.5.0 00:03:35.547 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user 00:03:35.547 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:35.547 Build type: native build 00:03:35.547 Project name: libvfio-user 00:03:35.547 Project version: 0.0.1 00:03:35.547 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)") 00:03:35.547 C linker for the host machine: clang-17 ld.bfd 2.40-14 00:03:35.547 Host machine cpu family: x86_64 00:03:35.548 Host machine cpu: x86_64 00:03:35.548 Run-time dependency threads found: YES 00:03:35.548 Library dl found: YES 00:03:35.548 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:03:35.548 Run-time dependency json-c found: YES 0.17 00:03:35.548 Run-time dependency cmocka found: YES 1.1.7 00:03:35.548 Program pytest-3 found: NO 00:03:35.548 Program flake8 found: NO 00:03:35.548 Program misspell-fixer found: NO 00:03:35.548 Program restructuredtext-lint found: NO 00:03:35.548 Program valgrind found: YES (/usr/bin/valgrind) 00:03:35.548 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:03:35.548 Compiler for C supports arguments -Wmissing-declarations: YES 00:03:35.548 Compiler for C supports arguments -Wwrite-strings: YES 00:03:35.548 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 00:03:35.548 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh) 00:03:35.548 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh) 00:03:35.548 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 
00:03:35.548 Build targets in project: 8 00:03:35.548 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:03:35.548 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:03:35.548 00:03:35.548 libvfio-user 0.0.1 00:03:35.548 00:03:35.548 User defined options 00:03:35.548 buildtype : debug 00:03:35.548 default_library: static 00:03:35.548 libdir : /usr/local/lib 00:03:35.548 00:03:35.548 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:03:35.548 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:35.548 [1/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:03:35.548 [2/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:03:35.548 [3/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:03:35.548 [4/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:03:35.548 [5/36] Compiling C object samples/null.p/null.c.o 00:03:35.548 [6/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:03:35.548 [7/36] Compiling C object samples/lspci.p/lspci.c.o 00:03:35.548 [8/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:03:35.548 [9/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:03:35.548 [10/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:03:35.548 [11/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:03:35.548 [12/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:03:35.548 [13/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:03:35.808 [14/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:03:35.808 [15/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:03:35.808 [16/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:03:35.808 [17/36] Compiling C object test/unit_tests.p/mocks.c.o 00:03:35.808 [18/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:03:35.808 [19/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:03:35.808 [20/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:03:35.808 [21/36] Compiling C object samples/server.p/server.c.o 00:03:35.808 [22/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:03:35.808 [23/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:03:35.808 [24/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:03:35.808 [25/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:03:35.808 [26/36] Compiling C object samples/client.p/client.c.o 00:03:35.808 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:03:35.808 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:03:35.808 [29/36] Linking static target lib/libvfio-user.a 00:03:35.808 [30/36] Linking target samples/client 00:03:35.808 [31/36] Linking target samples/shadow_ioeventfd_server 00:03:35.808 [32/36] Linking target samples/lspci 00:03:35.808 [33/36] Linking target test/unit_tests 00:03:35.808 [34/36] Linking target samples/server 00:03:35.808 [35/36] Linking target samples/null 00:03:35.808 [36/36] Linking target samples/gpio-pci-idio-16 00:03:35.808 INFO: autodetecting backend as ninja 00:03:35.808 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:35.808 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:36.377 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:36.377 ninja: no work to do. 00:03:48.600 CC lib/ut_mock/mock.o 00:03:48.600 CC lib/log/log.o 00:03:48.600 CC lib/log/log_flags.o 00:03:48.600 CC lib/log/log_deprecated.o 00:03:48.600 CC lib/ut/ut.o 00:03:48.600 LIB libspdk_ut_mock.a 00:03:48.600 LIB libspdk_log.a 00:03:48.600 LIB libspdk_ut.a 00:03:48.860 CC lib/dma/dma.o 00:03:48.860 CC lib/util/base64.o 00:03:48.860 CXX lib/trace_parser/trace.o 00:03:48.860 CC lib/util/bit_array.o 00:03:48.860 CC lib/util/cpuset.o 00:03:48.860 CC lib/ioat/ioat.o 00:03:48.860 CC lib/util/crc16.o 00:03:48.860 CC lib/util/crc32.o 00:03:48.860 CC lib/util/crc32c.o 00:03:48.860 CC lib/util/crc32_ieee.o 00:03:48.860 CC lib/util/crc64.o 00:03:48.860 CC lib/util/dif.o 00:03:48.860 CC lib/util/fd.o 00:03:48.860 CC lib/util/fd_group.o 00:03:48.860 CC lib/util/file.o 00:03:48.860 CC lib/util/hexlify.o 00:03:48.860 CC lib/util/iov.o 00:03:48.861 CC lib/util/math.o 00:03:48.861 CC lib/util/net.o 00:03:48.861 CC lib/util/pipe.o 00:03:48.861 CC lib/util/strerror_tls.o 00:03:48.861 CC lib/util/string.o 00:03:48.861 CC lib/util/uuid.o 00:03:48.861 CC lib/util/zipf.o 00:03:48.861 CC lib/util/xor.o 00:03:48.861 CC lib/util/md5.o 00:03:49.121 CC lib/vfio_user/host/vfio_user_pci.o 00:03:49.121 CC lib/vfio_user/host/vfio_user.o 00:03:49.121 LIB libspdk_dma.a 00:03:49.121 LIB libspdk_ioat.a 00:03:49.121 LIB libspdk_vfio_user.a 00:03:49.121 LIB libspdk_util.a 00:03:49.381 LIB libspdk_trace_parser.a 00:03:49.641 CC lib/rdma_utils/rdma_utils.o 00:03:49.641 CC lib/json/json_parse.o 00:03:49.641 CC lib/conf/conf.o 00:03:49.641 CC lib/json/json_util.o 00:03:49.641 CC lib/json/json_write.o 00:03:49.641 CC lib/env_dpdk/env.o 00:03:49.641 CC lib/vmd/vmd.o 00:03:49.641 CC lib/idxd/idxd.o 00:03:49.641 CC lib/vmd/led.o 00:03:49.641 CC lib/env_dpdk/memory.o 00:03:49.641 CC lib/idxd/idxd_user.o 00:03:49.641 CC lib/env_dpdk/pci.o 00:03:49.641 CC lib/idxd/idxd_kernel.o 00:03:49.641 CC lib/env_dpdk/init.o 00:03:49.641 CC lib/env_dpdk/threads.o 00:03:49.641 CC lib/env_dpdk/pci_ioat.o 00:03:49.641 CC lib/env_dpdk/pci_virtio.o 00:03:49.641 CC lib/env_dpdk/pci_vmd.o 00:03:49.641 CC lib/env_dpdk/pci_idxd.o 00:03:49.641 CC lib/env_dpdk/pci_event.o 00:03:49.641 CC lib/env_dpdk/sigbus_handler.o 00:03:49.641 CC lib/env_dpdk/pci_dpdk.o 00:03:49.641 CC lib/env_dpdk/pci_dpdk_2207.o 00:03:49.641 CC lib/env_dpdk/pci_dpdk_2211.o 00:03:49.641 LIB libspdk_conf.a 00:03:49.641 LIB libspdk_rdma_utils.a 00:03:49.641 LIB libspdk_json.a 00:03:49.901 LIB libspdk_idxd.a 00:03:49.901 LIB libspdk_vmd.a 00:03:49.901 CC lib/jsonrpc/jsonrpc_server.o 00:03:49.901 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:03:49.901 CC lib/jsonrpc/jsonrpc_client.o 00:03:49.901 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:03:49.901 CC lib/rdma_provider/common.o 00:03:49.901 CC lib/rdma_provider/rdma_provider_verbs.o 00:03:50.161 LIB libspdk_rdma_provider.a 00:03:50.161 LIB libspdk_jsonrpc.a 00:03:50.421 LIB libspdk_env_dpdk.a 00:03:50.421 CC lib/rpc/rpc.o 00:03:50.681 LIB libspdk_rpc.a 00:03:50.941 CC lib/notify/notify.o 00:03:50.941 CC lib/notify/notify_rpc.o 00:03:50.941 CC lib/trace/trace.o 00:03:50.941 CC lib/trace/trace_flags.o 00:03:50.941 CC lib/trace/trace_rpc.o 00:03:50.941 CC lib/keyring/keyring.o 00:03:50.941 CC lib/keyring/keyring_rpc.o 00:03:51.201 LIB libspdk_notify.a 00:03:51.201 LIB libspdk_trace.a 00:03:51.201 LIB 
libspdk_keyring.a 00:03:51.461 CC lib/thread/thread.o 00:03:51.461 CC lib/thread/iobuf.o 00:03:51.461 CC lib/sock/sock.o 00:03:51.461 CC lib/sock/sock_rpc.o 00:03:51.721 LIB libspdk_sock.a 00:03:51.981 CC lib/nvme/nvme_ctrlr_cmd.o 00:03:51.981 CC lib/nvme/nvme_ctrlr.o 00:03:51.981 CC lib/nvme/nvme_fabric.o 00:03:51.981 CC lib/nvme/nvme_ns_cmd.o 00:03:51.981 CC lib/nvme/nvme_ns.o 00:03:51.981 CC lib/nvme/nvme_pcie_common.o 00:03:51.981 CC lib/nvme/nvme_pcie.o 00:03:51.981 CC lib/nvme/nvme_qpair.o 00:03:51.981 CC lib/nvme/nvme.o 00:03:51.981 CC lib/nvme/nvme_quirks.o 00:03:51.981 CC lib/nvme/nvme_transport.o 00:03:51.981 CC lib/nvme/nvme_discovery.o 00:03:51.981 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:03:51.981 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:03:51.981 CC lib/nvme/nvme_tcp.o 00:03:51.981 CC lib/nvme/nvme_opal.o 00:03:51.981 CC lib/nvme/nvme_io_msg.o 00:03:51.981 CC lib/nvme/nvme_poll_group.o 00:03:51.981 CC lib/nvme/nvme_zns.o 00:03:51.981 CC lib/nvme/nvme_stubs.o 00:03:51.981 CC lib/nvme/nvme_auth.o 00:03:51.981 CC lib/nvme/nvme_cuse.o 00:03:51.981 CC lib/nvme/nvme_vfio_user.o 00:03:51.981 CC lib/nvme/nvme_rdma.o 00:03:52.241 LIB libspdk_thread.a 00:03:52.500 CC lib/accel/accel.o 00:03:52.500 CC lib/accel/accel_rpc.o 00:03:52.500 CC lib/accel/accel_sw.o 00:03:52.500 CC lib/fsdev/fsdev.o 00:03:52.500 CC lib/fsdev/fsdev_io.o 00:03:52.500 CC lib/fsdev/fsdev_rpc.o 00:03:52.500 CC lib/vfu_tgt/tgt_endpoint.o 00:03:52.500 CC lib/vfu_tgt/tgt_rpc.o 00:03:52.500 CC lib/init/json_config.o 00:03:52.500 CC lib/virtio/virtio.o 00:03:52.500 CC lib/init/subsystem.o 00:03:52.500 CC lib/virtio/virtio_vhost_user.o 00:03:52.500 CC lib/init/subsystem_rpc.o 00:03:52.500 CC lib/virtio/virtio_vfio_user.o 00:03:52.500 CC lib/init/rpc.o 00:03:52.500 CC lib/blob/blobstore.o 00:03:52.500 CC lib/virtio/virtio_pci.o 00:03:52.500 CC lib/blob/request.o 00:03:52.500 CC lib/blob/zeroes.o 00:03:52.500 CC lib/blob/blob_bs_dev.o 00:03:52.760 LIB libspdk_init.a 00:03:52.760 LIB libspdk_virtio.a 00:03:52.760 LIB libspdk_vfu_tgt.a 00:03:52.760 LIB libspdk_fsdev.a 00:03:53.020 CC lib/event/app.o 00:03:53.020 CC lib/event/reactor.o 00:03:53.020 CC lib/event/log_rpc.o 00:03:53.020 CC lib/event/app_rpc.o 00:03:53.020 CC lib/event/scheduler_static.o 00:03:53.278 CC lib/fuse_dispatcher/fuse_dispatcher.o 00:03:53.278 LIB libspdk_accel.a 00:03:53.278 LIB libspdk_event.a 00:03:53.537 LIB libspdk_nvme.a 00:03:53.537 CC lib/bdev/bdev.o 00:03:53.537 CC lib/bdev/bdev_rpc.o 00:03:53.537 CC lib/bdev/bdev_zone.o 00:03:53.537 CC lib/bdev/part.o 00:03:53.537 CC lib/bdev/scsi_nvme.o 00:03:53.537 LIB libspdk_fuse_dispatcher.a 00:03:54.106 LIB libspdk_blob.a 00:03:54.675 CC lib/blobfs/blobfs.o 00:03:54.675 CC lib/lvol/lvol.o 00:03:54.675 CC lib/blobfs/tree.o 00:03:54.936 LIB libspdk_lvol.a 00:03:54.936 LIB libspdk_blobfs.a 00:03:55.195 LIB libspdk_bdev.a 00:03:55.455 CC lib/nbd/nbd.o 00:03:55.455 CC lib/ublk/ublk.o 00:03:55.455 CC lib/nbd/nbd_rpc.o 00:03:55.455 CC lib/ublk/ublk_rpc.o 00:03:55.455 CC lib/scsi/dev.o 00:03:55.455 CC lib/scsi/port.o 00:03:55.455 CC lib/scsi/lun.o 00:03:55.455 CC lib/scsi/scsi.o 00:03:55.455 CC lib/scsi/scsi_bdev.o 00:03:55.455 CC lib/scsi/scsi_pr.o 00:03:55.455 CC lib/scsi/scsi_rpc.o 00:03:55.455 CC lib/nvmf/ctrlr.o 00:03:55.455 CC lib/scsi/task.o 00:03:55.455 CC lib/nvmf/ctrlr_discovery.o 00:03:55.455 CC lib/nvmf/ctrlr_bdev.o 00:03:55.455 CC lib/nvmf/subsystem.o 00:03:55.455 CC lib/ftl/ftl_core.o 00:03:55.455 CC lib/nvmf/nvmf.o 00:03:55.455 CC lib/ftl/ftl_init.o 00:03:55.455 CC lib/nvmf/nvmf_rpc.o 00:03:55.455 CC 
lib/ftl/ftl_debug.o 00:03:55.455 CC lib/ftl/ftl_layout.o 00:03:55.455 CC lib/nvmf/transport.o 00:03:55.455 CC lib/ftl/ftl_io.o 00:03:55.455 CC lib/nvmf/tcp.o 00:03:55.455 CC lib/ftl/ftl_sb.o 00:03:55.455 CC lib/nvmf/stubs.o 00:03:55.455 CC lib/ftl/ftl_l2p.o 00:03:55.455 CC lib/ftl/ftl_l2p_flat.o 00:03:55.455 CC lib/ftl/ftl_nv_cache.o 00:03:55.455 CC lib/nvmf/vfio_user.o 00:03:55.455 CC lib/nvmf/mdns_server.o 00:03:55.455 CC lib/ftl/ftl_band.o 00:03:55.455 CC lib/ftl/ftl_band_ops.o 00:03:55.455 CC lib/nvmf/rdma.o 00:03:55.455 CC lib/ftl/ftl_writer.o 00:03:55.455 CC lib/nvmf/auth.o 00:03:55.455 CC lib/ftl/ftl_rq.o 00:03:55.455 CC lib/ftl/ftl_reloc.o 00:03:55.455 CC lib/ftl/ftl_l2p_cache.o 00:03:55.455 CC lib/ftl/ftl_p2l.o 00:03:55.455 CC lib/ftl/ftl_p2l_log.o 00:03:55.455 CC lib/ftl/mngt/ftl_mngt.o 00:03:55.455 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:03:55.455 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:03:55.455 CC lib/ftl/mngt/ftl_mngt_startup.o 00:03:55.455 CC lib/ftl/mngt/ftl_mngt_md.o 00:03:55.455 CC lib/ftl/mngt/ftl_mngt_misc.o 00:03:55.455 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:03:55.455 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:03:55.455 CC lib/ftl/mngt/ftl_mngt_band.o 00:03:55.455 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:03:55.715 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:03:55.715 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:03:55.715 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:03:55.715 CC lib/ftl/utils/ftl_conf.o 00:03:55.715 CC lib/ftl/utils/ftl_md.o 00:03:55.715 CC lib/ftl/utils/ftl_mempool.o 00:03:55.715 CC lib/ftl/utils/ftl_bitmap.o 00:03:55.715 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:03:55.715 CC lib/ftl/utils/ftl_property.o 00:03:55.715 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:03:55.715 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:03:55.715 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:03:55.715 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:03:55.715 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:03:55.715 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:03:55.715 CC lib/ftl/upgrade/ftl_sb_v3.o 00:03:55.715 CC lib/ftl/upgrade/ftl_sb_v5.o 00:03:55.715 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:03:55.715 CC lib/ftl/nvc/ftl_nvc_dev.o 00:03:55.715 CC lib/ftl/nvc/ftl_nvc_bdev_non_vss.o 00:03:55.715 CC lib/ftl/nvc/ftl_nvc_bdev_common.o 00:03:55.715 CC lib/ftl/base/ftl_base_dev.o 00:03:55.715 CC lib/ftl/ftl_trace.o 00:03:55.715 CC lib/ftl/base/ftl_base_bdev.o 00:03:55.974 LIB libspdk_scsi.a 00:03:55.974 LIB libspdk_nbd.a 00:03:55.974 LIB libspdk_ublk.a 00:03:56.235 CC lib/iscsi/conn.o 00:03:56.235 CC lib/iscsi/init_grp.o 00:03:56.235 CC lib/iscsi/iscsi.o 00:03:56.235 CC lib/iscsi/portal_grp.o 00:03:56.235 CC lib/iscsi/param.o 00:03:56.235 CC lib/vhost/vhost.o 00:03:56.235 CC lib/iscsi/tgt_node.o 00:03:56.235 CC lib/vhost/vhost_rpc.o 00:03:56.235 CC lib/iscsi/iscsi_subsystem.o 00:03:56.235 CC lib/vhost/vhost_blk.o 00:03:56.235 CC lib/vhost/vhost_scsi.o 00:03:56.235 CC lib/iscsi/iscsi_rpc.o 00:03:56.235 CC lib/vhost/rte_vhost_user.o 00:03:56.235 CC lib/iscsi/task.o 00:03:56.235 LIB libspdk_ftl.a 00:03:56.804 LIB libspdk_nvmf.a 00:03:56.804 LIB libspdk_vhost.a 00:03:57.065 LIB libspdk_iscsi.a 00:03:57.325 CC module/env_dpdk/env_dpdk_rpc.o 00:03:57.585 CC module/vfu_device/vfu_virtio.o 00:03:57.585 CC module/vfu_device/vfu_virtio_blk.o 00:03:57.585 CC module/vfu_device/vfu_virtio_scsi.o 00:03:57.585 CC module/vfu_device/vfu_virtio_rpc.o 00:03:57.585 CC module/vfu_device/vfu_virtio_fs.o 00:03:57.585 LIB libspdk_env_dpdk_rpc.a 00:03:57.585 CC module/scheduler/gscheduler/gscheduler.o 00:03:57.585 CC module/blob/bdev/blob_bdev.o 00:03:57.585 CC 
module/accel/iaa/accel_iaa_rpc.o 00:03:57.585 CC module/accel/iaa/accel_iaa.o 00:03:57.585 CC module/keyring/file/keyring.o 00:03:57.585 CC module/keyring/linux/keyring_rpc.o 00:03:57.585 CC module/keyring/file/keyring_rpc.o 00:03:57.585 CC module/keyring/linux/keyring.o 00:03:57.585 CC module/sock/posix/posix.o 00:03:57.585 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:03:57.585 CC module/accel/ioat/accel_ioat_rpc.o 00:03:57.585 CC module/scheduler/dynamic/scheduler_dynamic.o 00:03:57.585 CC module/accel/ioat/accel_ioat.o 00:03:57.585 CC module/accel/error/accel_error.o 00:03:57.585 CC module/accel/dsa/accel_dsa.o 00:03:57.585 CC module/accel/error/accel_error_rpc.o 00:03:57.585 CC module/accel/dsa/accel_dsa_rpc.o 00:03:57.585 CC module/fsdev/aio/fsdev_aio.o 00:03:57.585 CC module/fsdev/aio/fsdev_aio_rpc.o 00:03:57.585 CC module/fsdev/aio/linux_aio_mgr.o 00:03:57.845 LIB libspdk_scheduler_gscheduler.a 00:03:57.845 LIB libspdk_keyring_file.a 00:03:57.845 LIB libspdk_keyring_linux.a 00:03:57.845 LIB libspdk_scheduler_dpdk_governor.a 00:03:57.845 LIB libspdk_scheduler_dynamic.a 00:03:57.845 LIB libspdk_accel_iaa.a 00:03:57.845 LIB libspdk_accel_ioat.a 00:03:57.845 LIB libspdk_accel_error.a 00:03:57.845 LIB libspdk_blob_bdev.a 00:03:57.845 LIB libspdk_accel_dsa.a 00:03:57.845 LIB libspdk_vfu_device.a 00:03:58.104 LIB libspdk_sock_posix.a 00:03:58.104 LIB libspdk_fsdev_aio.a 00:03:58.104 CC module/bdev/zone_block/vbdev_zone_block.o 00:03:58.104 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:03:58.104 CC module/bdev/lvol/vbdev_lvol.o 00:03:58.104 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:03:58.104 CC module/bdev/delay/vbdev_delay.o 00:03:58.104 CC module/bdev/error/vbdev_error.o 00:03:58.104 CC module/bdev/delay/vbdev_delay_rpc.o 00:03:58.104 CC module/bdev/error/vbdev_error_rpc.o 00:03:58.104 CC module/blobfs/bdev/blobfs_bdev.o 00:03:58.364 CC module/bdev/null/bdev_null.o 00:03:58.364 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:03:58.364 CC module/bdev/null/bdev_null_rpc.o 00:03:58.364 CC module/bdev/raid/bdev_raid.o 00:03:58.364 CC module/bdev/passthru/vbdev_passthru.o 00:03:58.364 CC module/bdev/gpt/vbdev_gpt.o 00:03:58.364 CC module/bdev/gpt/gpt.o 00:03:58.364 CC module/bdev/raid/bdev_raid_rpc.o 00:03:58.364 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:03:58.364 CC module/bdev/split/vbdev_split_rpc.o 00:03:58.364 CC module/bdev/split/vbdev_split.o 00:03:58.364 CC module/bdev/malloc/bdev_malloc_rpc.o 00:03:58.364 CC module/bdev/malloc/bdev_malloc.o 00:03:58.364 CC module/bdev/raid/bdev_raid_sb.o 00:03:58.364 CC module/bdev/iscsi/bdev_iscsi.o 00:03:58.364 CC module/bdev/raid/raid0.o 00:03:58.364 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:03:58.364 CC module/bdev/aio/bdev_aio.o 00:03:58.364 CC module/bdev/raid/raid1.o 00:03:58.364 CC module/bdev/virtio/bdev_virtio_scsi.o 00:03:58.364 CC module/bdev/nvme/bdev_nvme.o 00:03:58.364 CC module/bdev/raid/concat.o 00:03:58.364 CC module/bdev/aio/bdev_aio_rpc.o 00:03:58.364 CC module/bdev/virtio/bdev_virtio_blk.o 00:03:58.364 CC module/bdev/ftl/bdev_ftl_rpc.o 00:03:58.364 CC module/bdev/ftl/bdev_ftl.o 00:03:58.364 CC module/bdev/virtio/bdev_virtio_rpc.o 00:03:58.364 CC module/bdev/nvme/bdev_nvme_rpc.o 00:03:58.364 CC module/bdev/nvme/nvme_rpc.o 00:03:58.364 CC module/bdev/nvme/bdev_mdns_client.o 00:03:58.364 CC module/bdev/nvme/vbdev_opal.o 00:03:58.364 CC module/bdev/nvme/vbdev_opal_rpc.o 00:03:58.364 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:03:58.364 LIB libspdk_blobfs_bdev.a 00:03:58.364 LIB libspdk_bdev_split.a 00:03:58.364 LIB 
libspdk_bdev_gpt.a 00:03:58.364 LIB libspdk_bdev_error.a 00:03:58.364 LIB libspdk_bdev_null.a 00:03:58.364 LIB libspdk_bdev_ftl.a 00:03:58.364 LIB libspdk_bdev_zone_block.a 00:03:58.364 LIB libspdk_bdev_passthru.a 00:03:58.364 LIB libspdk_bdev_aio.a 00:03:58.624 LIB libspdk_bdev_iscsi.a 00:03:58.624 LIB libspdk_bdev_delay.a 00:03:58.624 LIB libspdk_bdev_malloc.a 00:03:58.624 LIB libspdk_bdev_lvol.a 00:03:58.624 LIB libspdk_bdev_virtio.a 00:03:58.884 LIB libspdk_bdev_raid.a 00:03:59.824 LIB libspdk_bdev_nvme.a 00:04:00.395 CC module/event/subsystems/iobuf/iobuf.o 00:04:00.395 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:04:00.395 CC module/event/subsystems/keyring/keyring.o 00:04:00.395 CC module/event/subsystems/vmd/vmd_rpc.o 00:04:00.395 CC module/event/subsystems/vmd/vmd.o 00:04:00.395 CC module/event/subsystems/fsdev/fsdev.o 00:04:00.395 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:04:00.395 CC module/event/subsystems/sock/sock.o 00:04:00.395 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:04:00.395 CC module/event/subsystems/scheduler/scheduler.o 00:04:00.395 LIB libspdk_event_keyring.a 00:04:00.395 LIB libspdk_event_vmd.a 00:04:00.395 LIB libspdk_event_iobuf.a 00:04:00.395 LIB libspdk_event_fsdev.a 00:04:00.395 LIB libspdk_event_vfu_tgt.a 00:04:00.395 LIB libspdk_event_scheduler.a 00:04:00.395 LIB libspdk_event_vhost_blk.a 00:04:00.395 LIB libspdk_event_sock.a 00:04:00.655 CC module/event/subsystems/accel/accel.o 00:04:00.915 LIB libspdk_event_accel.a 00:04:01.176 CC module/event/subsystems/bdev/bdev.o 00:04:01.176 LIB libspdk_event_bdev.a 00:04:01.748 CC module/event/subsystems/ublk/ublk.o 00:04:01.748 CC module/event/subsystems/scsi/scsi.o 00:04:01.748 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:04:01.748 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:04:01.748 CC module/event/subsystems/nbd/nbd.o 00:04:01.748 LIB libspdk_event_ublk.a 00:04:01.748 LIB libspdk_event_nbd.a 00:04:01.748 LIB libspdk_event_scsi.a 00:04:01.748 LIB libspdk_event_nvmf.a 00:04:02.008 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:04:02.008 CC module/event/subsystems/iscsi/iscsi.o 00:04:02.267 LIB libspdk_event_vhost_scsi.a 00:04:02.267 LIB libspdk_event_iscsi.a 00:04:02.531 CXX app/trace/trace.o 00:04:02.531 CC app/spdk_top/spdk_top.o 00:04:02.531 CC app/spdk_lspci/spdk_lspci.o 00:04:02.531 CC app/trace_record/trace_record.o 00:04:02.532 CC app/spdk_nvme_identify/identify.o 00:04:02.532 CC app/spdk_nvme_perf/perf.o 00:04:02.532 CC app/spdk_nvme_discover/discovery_aer.o 00:04:02.532 TEST_HEADER include/spdk/accel.h 00:04:02.532 TEST_HEADER include/spdk/accel_module.h 00:04:02.532 TEST_HEADER include/spdk/assert.h 00:04:02.532 TEST_HEADER include/spdk/barrier.h 00:04:02.532 TEST_HEADER include/spdk/bdev.h 00:04:02.532 TEST_HEADER include/spdk/base64.h 00:04:02.532 TEST_HEADER include/spdk/bdev_module.h 00:04:02.532 TEST_HEADER include/spdk/bdev_zone.h 00:04:02.532 TEST_HEADER include/spdk/bit_array.h 00:04:02.532 CC test/rpc_client/rpc_client_test.o 00:04:02.532 TEST_HEADER include/spdk/bit_pool.h 00:04:02.532 TEST_HEADER include/spdk/blob_bdev.h 00:04:02.532 TEST_HEADER include/spdk/blobfs_bdev.h 00:04:02.532 TEST_HEADER include/spdk/blobfs.h 00:04:02.532 CC examples/interrupt_tgt/interrupt_tgt.o 00:04:02.532 TEST_HEADER include/spdk/blob.h 00:04:02.532 TEST_HEADER include/spdk/conf.h 00:04:02.532 TEST_HEADER include/spdk/config.h 00:04:02.532 CC app/iscsi_tgt/iscsi_tgt.o 00:04:02.532 TEST_HEADER include/spdk/cpuset.h 00:04:02.532 TEST_HEADER include/spdk/crc16.h 00:04:02.532 TEST_HEADER 
include/spdk/crc32.h 00:04:02.532 TEST_HEADER include/spdk/dma.h 00:04:02.532 TEST_HEADER include/spdk/crc64.h 00:04:02.532 TEST_HEADER include/spdk/dif.h 00:04:02.532 TEST_HEADER include/spdk/env.h 00:04:02.532 TEST_HEADER include/spdk/endian.h 00:04:02.532 CC app/nvmf_tgt/nvmf_main.o 00:04:02.532 TEST_HEADER include/spdk/event.h 00:04:02.532 TEST_HEADER include/spdk/env_dpdk.h 00:04:02.532 TEST_HEADER include/spdk/fd_group.h 00:04:02.532 TEST_HEADER include/spdk/fsdev.h 00:04:02.532 TEST_HEADER include/spdk/fd.h 00:04:02.532 TEST_HEADER include/spdk/file.h 00:04:02.532 TEST_HEADER include/spdk/fuse_dispatcher.h 00:04:02.532 TEST_HEADER include/spdk/fsdev_module.h 00:04:02.532 TEST_HEADER include/spdk/ftl.h 00:04:02.532 TEST_HEADER include/spdk/histogram_data.h 00:04:02.532 TEST_HEADER include/spdk/gpt_spec.h 00:04:02.532 TEST_HEADER include/spdk/init.h 00:04:02.532 TEST_HEADER include/spdk/idxd_spec.h 00:04:02.532 TEST_HEADER include/spdk/hexlify.h 00:04:02.532 TEST_HEADER include/spdk/ioat.h 00:04:02.532 TEST_HEADER include/spdk/idxd.h 00:04:02.532 TEST_HEADER include/spdk/ioat_spec.h 00:04:02.532 TEST_HEADER include/spdk/json.h 00:04:02.532 TEST_HEADER include/spdk/jsonrpc.h 00:04:02.532 TEST_HEADER include/spdk/iscsi_spec.h 00:04:02.532 TEST_HEADER include/spdk/keyring_module.h 00:04:02.532 TEST_HEADER include/spdk/keyring.h 00:04:02.532 TEST_HEADER include/spdk/log.h 00:04:02.532 TEST_HEADER include/spdk/md5.h 00:04:02.532 TEST_HEADER include/spdk/lvol.h 00:04:02.532 TEST_HEADER include/spdk/likely.h 00:04:02.532 TEST_HEADER include/spdk/nbd.h 00:04:02.532 TEST_HEADER include/spdk/memory.h 00:04:02.532 TEST_HEADER include/spdk/net.h 00:04:02.532 TEST_HEADER include/spdk/mmio.h 00:04:02.532 CC app/spdk_dd/spdk_dd.o 00:04:02.532 TEST_HEADER include/spdk/notify.h 00:04:02.532 TEST_HEADER include/spdk/nvme_ocssd.h 00:04:02.532 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:04:02.532 TEST_HEADER include/spdk/nvme.h 00:04:02.532 TEST_HEADER include/spdk/nvme_intel.h 00:04:02.532 TEST_HEADER include/spdk/nvme_spec.h 00:04:02.532 TEST_HEADER include/spdk/nvmf_spec.h 00:04:02.532 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:04:02.532 TEST_HEADER include/spdk/nvme_zns.h 00:04:02.532 TEST_HEADER include/spdk/nvmf.h 00:04:02.532 TEST_HEADER include/spdk/nvmf_cmd.h 00:04:02.532 TEST_HEADER include/spdk/opal.h 00:04:02.532 TEST_HEADER include/spdk/nvmf_transport.h 00:04:02.532 TEST_HEADER include/spdk/pipe.h 00:04:02.532 TEST_HEADER include/spdk/opal_spec.h 00:04:02.532 TEST_HEADER include/spdk/reduce.h 00:04:02.532 TEST_HEADER include/spdk/rpc.h 00:04:02.532 TEST_HEADER include/spdk/pci_ids.h 00:04:02.532 TEST_HEADER include/spdk/queue.h 00:04:02.532 TEST_HEADER include/spdk/scsi_spec.h 00:04:02.532 TEST_HEADER include/spdk/sock.h 00:04:02.532 TEST_HEADER include/spdk/scsi.h 00:04:02.532 TEST_HEADER include/spdk/stdinc.h 00:04:02.532 TEST_HEADER include/spdk/string.h 00:04:02.532 TEST_HEADER include/spdk/scheduler.h 00:04:02.532 TEST_HEADER include/spdk/thread.h 00:04:02.532 TEST_HEADER include/spdk/trace.h 00:04:02.532 TEST_HEADER include/spdk/tree.h 00:04:02.532 TEST_HEADER include/spdk/trace_parser.h 00:04:02.532 TEST_HEADER include/spdk/ublk.h 00:04:02.532 TEST_HEADER include/spdk/util.h 00:04:02.532 TEST_HEADER include/spdk/version.h 00:04:02.532 TEST_HEADER include/spdk/uuid.h 00:04:02.532 TEST_HEADER include/spdk/vfio_user_pci.h 00:04:02.532 TEST_HEADER include/spdk/vfio_user_spec.h 00:04:02.532 TEST_HEADER include/spdk/vhost.h 00:04:02.532 TEST_HEADER include/spdk/vmd.h 00:04:02.532 
TEST_HEADER include/spdk/xor.h 00:04:02.532 CC app/spdk_tgt/spdk_tgt.o 00:04:02.532 CXX test/cpp_headers/accel.o 00:04:02.532 TEST_HEADER include/spdk/zipf.h 00:04:02.532 CXX test/cpp_headers/assert.o 00:04:02.532 CXX test/cpp_headers/accel_module.o 00:04:02.532 CXX test/cpp_headers/barrier.o 00:04:02.532 CXX test/cpp_headers/base64.o 00:04:02.532 CXX test/cpp_headers/bdev_module.o 00:04:02.532 CXX test/cpp_headers/bdev.o 00:04:02.532 CXX test/cpp_headers/bdev_zone.o 00:04:02.532 CXX test/cpp_headers/bit_pool.o 00:04:02.532 CXX test/cpp_headers/blob_bdev.o 00:04:02.532 CXX test/cpp_headers/bit_array.o 00:04:02.532 CXX test/cpp_headers/blobfs_bdev.o 00:04:02.532 CXX test/cpp_headers/blobfs.o 00:04:02.532 CXX test/cpp_headers/blob.o 00:04:02.532 CXX test/cpp_headers/cpuset.o 00:04:02.532 CXX test/cpp_headers/config.o 00:04:02.532 CXX test/cpp_headers/conf.o 00:04:02.532 CXX test/cpp_headers/crc32.o 00:04:02.532 CXX test/cpp_headers/crc16.o 00:04:02.532 CXX test/cpp_headers/crc64.o 00:04:02.532 CXX test/cpp_headers/dif.o 00:04:02.532 CXX test/cpp_headers/dma.o 00:04:02.532 CXX test/cpp_headers/endian.o 00:04:02.532 CXX test/cpp_headers/env.o 00:04:02.532 CXX test/cpp_headers/event.o 00:04:02.532 CXX test/cpp_headers/env_dpdk.o 00:04:02.532 CXX test/cpp_headers/file.o 00:04:02.532 CXX test/cpp_headers/fd_group.o 00:04:02.532 CXX test/cpp_headers/fd.o 00:04:02.532 CXX test/cpp_headers/fsdev.o 00:04:02.532 CXX test/cpp_headers/fsdev_module.o 00:04:02.532 CXX test/cpp_headers/ftl.o 00:04:02.532 CXX test/cpp_headers/fuse_dispatcher.o 00:04:02.532 CC examples/ioat/verify/verify.o 00:04:02.532 CXX test/cpp_headers/histogram_data.o 00:04:02.532 CC examples/ioat/perf/perf.o 00:04:02.532 CXX test/cpp_headers/idxd.o 00:04:02.532 CXX test/cpp_headers/hexlify.o 00:04:02.532 CXX test/cpp_headers/gpt_spec.o 00:04:02.532 CXX test/cpp_headers/init.o 00:04:02.532 CXX test/cpp_headers/idxd_spec.o 00:04:02.532 CXX test/cpp_headers/ioat.o 00:04:02.532 CXX test/cpp_headers/ioat_spec.o 00:04:02.532 CXX test/cpp_headers/json.o 00:04:02.532 CXX test/cpp_headers/iscsi_spec.o 00:04:02.532 CXX test/cpp_headers/jsonrpc.o 00:04:02.532 CXX test/cpp_headers/keyring.o 00:04:02.532 CXX test/cpp_headers/keyring_module.o 00:04:02.532 CXX test/cpp_headers/likely.o 00:04:02.532 CXX test/cpp_headers/log.o 00:04:02.532 CC test/thread/lock/spdk_lock.o 00:04:02.532 CXX test/cpp_headers/lvol.o 00:04:02.532 CXX test/cpp_headers/md5.o 00:04:02.532 CXX test/cpp_headers/memory.o 00:04:02.532 CXX test/cpp_headers/nbd.o 00:04:02.532 CXX test/cpp_headers/mmio.o 00:04:02.532 CC app/fio/nvme/fio_plugin.o 00:04:02.532 CXX test/cpp_headers/net.o 00:04:02.532 CXX test/cpp_headers/nvme.o 00:04:02.532 CXX test/cpp_headers/notify.o 00:04:02.532 CXX test/cpp_headers/nvme_intel.o 00:04:02.532 CC test/env/memory/memory_ut.o 00:04:02.532 CXX test/cpp_headers/nvme_ocssd_spec.o 00:04:02.532 CXX test/cpp_headers/nvme_ocssd.o 00:04:02.532 CXX test/cpp_headers/nvme_spec.o 00:04:02.532 CXX test/cpp_headers/nvme_zns.o 00:04:02.532 CC examples/util/zipf/zipf.o 00:04:02.532 CXX test/cpp_headers/nvmf_cmd.o 00:04:02.532 CXX test/cpp_headers/nvmf_fc_spec.o 00:04:02.532 CXX test/cpp_headers/nvmf.o 00:04:02.532 CXX test/cpp_headers/nvmf_spec.o 00:04:02.532 CXX test/cpp_headers/nvmf_transport.o 00:04:02.532 CXX test/cpp_headers/opal.o 00:04:02.532 CXX test/cpp_headers/opal_spec.o 00:04:02.533 CXX test/cpp_headers/pipe.o 00:04:02.533 CXX test/cpp_headers/pci_ids.o 00:04:02.533 CXX test/cpp_headers/queue.o 00:04:02.533 CC test/app/jsoncat/jsoncat.o 00:04:02.533 CXX 
test/cpp_headers/reduce.o 00:04:02.533 CXX test/cpp_headers/rpc.o 00:04:02.533 CXX test/cpp_headers/scsi.o 00:04:02.533 LINK spdk_lspci 00:04:02.533 CXX test/cpp_headers/scheduler.o 00:04:02.533 CXX test/cpp_headers/scsi_spec.o 00:04:02.533 CC test/thread/poller_perf/poller_perf.o 00:04:02.533 CXX test/cpp_headers/stdinc.o 00:04:02.533 CXX test/cpp_headers/sock.o 00:04:02.533 CXX test/cpp_headers/string.o 00:04:02.533 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:04:02.533 CC test/app/histogram_perf/histogram_perf.o 00:04:02.533 CC test/env/vtophys/vtophys.o 00:04:02.533 CXX test/cpp_headers/thread.o 00:04:02.533 CC test/env/pci/pci_ut.o 00:04:02.533 CC test/app/stub/stub.o 00:04:02.792 CC test/dma/test_dma/test_dma.o 00:04:02.792 CC app/fio/bdev/fio_plugin.o 00:04:02.792 LINK spdk_nvme_discover 00:04:02.792 LINK rpc_client_test 00:04:02.792 CC test/app/bdev_svc/bdev_svc.o 00:04:02.792 CC test/env/mem_callbacks/mem_callbacks.o 00:04:02.792 LINK interrupt_tgt 00:04:02.792 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:04:02.792 LINK spdk_trace_record 00:04:02.792 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:04:02.792 LINK nvmf_tgt 00:04:02.792 CXX test/cpp_headers/trace.o 00:04:02.792 CXX test/cpp_headers/trace_parser.o 00:04:02.792 LINK iscsi_tgt 00:04:02.792 CXX test/cpp_headers/tree.o 00:04:02.792 CXX test/cpp_headers/ublk.o 00:04:02.792 CXX test/cpp_headers/util.o 00:04:02.792 CXX test/cpp_headers/uuid.o 00:04:02.792 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:04:02.792 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:04:02.792 CXX test/cpp_headers/version.o 00:04:02.792 LINK jsoncat 00:04:02.792 CXX test/cpp_headers/vfio_user_pci.o 00:04:02.792 CXX test/cpp_headers/vfio_user_spec.o 00:04:02.792 LINK zipf 00:04:02.792 CXX test/cpp_headers/vhost.o 00:04:02.792 CXX test/cpp_headers/vmd.o 00:04:02.792 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:04:02.792 CXX test/cpp_headers/xor.o 00:04:02.792 CXX test/cpp_headers/zipf.o 00:04:02.792 LINK poller_perf 00:04:02.792 LINK vtophys 00:04:02.792 LINK histogram_perf 00:04:02.792 LINK env_dpdk_post_init 00:04:02.792 LINK verify 00:04:02.792 LINK ioat_perf 00:04:02.792 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:04:02.792 LINK stub 00:04:03.051 LINK spdk_tgt 00:04:03.051 LINK spdk_trace 00:04:03.051 LINK bdev_svc 00:04:03.051 LINK spdk_dd 00:04:03.051 LINK pci_ut 00:04:03.051 LINK vhost_fuzz 00:04:03.051 LINK llvm_vfio_fuzz 00:04:03.051 LINK test_dma 00:04:03.051 LINK nvme_fuzz 00:04:03.310 LINK spdk_nvme_identify 00:04:03.310 LINK spdk_nvme 00:04:03.310 LINK spdk_top 00:04:03.310 LINK mem_callbacks 00:04:03.310 LINK spdk_bdev 00:04:03.310 LINK spdk_nvme_perf 00:04:03.310 LINK llvm_nvme_fuzz 00:04:03.568 CC app/vhost/vhost.o 00:04:03.568 CC examples/vmd/led/led.o 00:04:03.568 CC examples/vmd/lsvmd/lsvmd.o 00:04:03.568 CC examples/sock/hello_world/hello_sock.o 00:04:03.568 CC examples/idxd/perf/perf.o 00:04:03.568 CC examples/thread/thread/thread_ex.o 00:04:03.568 LINK led 00:04:03.568 LINK lsvmd 00:04:03.568 LINK vhost 00:04:03.568 LINK memory_ut 00:04:03.568 LINK hello_sock 00:04:03.828 LINK idxd_perf 00:04:03.828 LINK spdk_lock 00:04:03.828 LINK thread 00:04:03.828 LINK iscsi_fuzz 00:04:04.398 CC examples/nvme/hotplug/hotplug.o 00:04:04.398 CC examples/nvme/reconnect/reconnect.o 00:04:04.398 CC examples/nvme/cmb_copy/cmb_copy.o 00:04:04.398 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:04:04.398 CC examples/nvme/hello_world/hello_world.o 00:04:04.398 CC examples/nvme/arbitration/arbitration.o 00:04:04.398 CC 
examples/nvme/nvme_manage/nvme_manage.o 00:04:04.398 CC examples/nvme/abort/abort.o 00:04:04.398 CC test/event/reactor_perf/reactor_perf.o 00:04:04.398 CC test/event/reactor/reactor.o 00:04:04.398 CC test/event/event_perf/event_perf.o 00:04:04.398 CC test/event/app_repeat/app_repeat.o 00:04:04.398 CC test/event/scheduler/scheduler.o 00:04:04.398 LINK pmr_persistence 00:04:04.657 LINK reactor_perf 00:04:04.657 LINK hotplug 00:04:04.657 LINK cmb_copy 00:04:04.657 LINK event_perf 00:04:04.657 LINK reactor 00:04:04.657 LINK hello_world 00:04:04.657 LINK app_repeat 00:04:04.657 LINK reconnect 00:04:04.657 LINK abort 00:04:04.657 LINK arbitration 00:04:04.657 LINK scheduler 00:04:04.657 LINK nvme_manage 00:04:04.916 CC test/nvme/e2edp/nvme_dp.o 00:04:04.916 CC test/nvme/aer/aer.o 00:04:04.916 CC test/nvme/simple_copy/simple_copy.o 00:04:04.916 CC test/nvme/sgl/sgl.o 00:04:04.916 CC test/nvme/err_injection/err_injection.o 00:04:04.916 CC test/nvme/fdp/fdp.o 00:04:04.916 CC test/nvme/reserve/reserve.o 00:04:04.916 CC test/nvme/cuse/cuse.o 00:04:04.916 CC test/nvme/doorbell_aers/doorbell_aers.o 00:04:04.916 CC test/nvme/reset/reset.o 00:04:04.916 CC test/nvme/overhead/overhead.o 00:04:04.916 CC test/nvme/fused_ordering/fused_ordering.o 00:04:04.916 CC test/nvme/boot_partition/boot_partition.o 00:04:04.916 CC test/nvme/connect_stress/connect_stress.o 00:04:04.916 CC test/nvme/startup/startup.o 00:04:04.916 CC test/accel/dif/dif.o 00:04:04.916 CC test/nvme/compliance/nvme_compliance.o 00:04:04.916 CC test/blobfs/mkfs/mkfs.o 00:04:04.916 CC test/lvol/esnap/esnap.o 00:04:04.916 LINK doorbell_aers 00:04:04.916 LINK startup 00:04:04.916 LINK reserve 00:04:04.916 LINK err_injection 00:04:04.916 LINK connect_stress 00:04:04.916 LINK fused_ordering 00:04:04.916 LINK boot_partition 00:04:04.916 LINK simple_copy 00:04:04.916 LINK nvme_dp 00:04:04.916 LINK aer 00:04:04.916 LINK reset 00:04:04.916 LINK fdp 00:04:04.916 LINK sgl 00:04:04.916 LINK overhead 00:04:05.176 LINK mkfs 00:04:05.176 LINK nvme_compliance 00:04:05.176 LINK dif 00:04:05.436 CC examples/accel/perf/accel_perf.o 00:04:05.436 CC examples/fsdev/hello_world/hello_fsdev.o 00:04:05.436 CC examples/blob/hello_world/hello_blob.o 00:04:05.436 CC examples/blob/cli/blobcli.o 00:04:05.695 LINK hello_blob 00:04:05.695 LINK cuse 00:04:05.695 LINK hello_fsdev 00:04:05.695 LINK accel_perf 00:04:05.695 LINK blobcli 00:04:06.635 CC examples/bdev/bdevperf/bdevperf.o 00:04:06.635 CC examples/bdev/hello_world/hello_bdev.o 00:04:06.635 LINK hello_bdev 00:04:06.895 CC test/bdev/bdevio/bdevio.o 00:04:06.895 LINK bdevperf 00:04:07.154 LINK bdevio 00:04:08.094 LINK esnap 00:04:08.667 CC examples/nvmf/nvmf/nvmf.o 00:04:08.667 LINK nvmf 00:04:10.049 00:04:10.049 real 0m36.690s 00:04:10.049 user 4m40.163s 00:04:10.049 sys 1m45.069s 00:04:10.049 14:17:05 make -- common/autotest_common.sh@1130 -- $ xtrace_disable 00:04:10.049 14:17:05 make -- common/autotest_common.sh@10 -- $ set +x 00:04:10.049 ************************************ 00:04:10.049 END TEST make 00:04:10.049 ************************************ 00:04:10.049 14:17:05 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:04:10.049 14:17:05 -- pm/common@29 -- $ signal_monitor_resources TERM 00:04:10.049 14:17:05 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:04:10.049 14:17:05 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:10.049 14:17:05 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:04:10.049 14:17:05 -- 
pm/common@44 -- $ pid=180038 00:04:10.049 14:17:05 -- pm/common@50 -- $ kill -TERM 180038 00:04:10.049 14:17:05 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:10.049 14:17:05 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:04:10.049 14:17:05 -- pm/common@44 -- $ pid=180040 00:04:10.049 14:17:05 -- pm/common@50 -- $ kill -TERM 180040 00:04:10.049 14:17:05 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:10.049 14:17:05 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:04:10.049 14:17:05 -- pm/common@44 -- $ pid=180042 00:04:10.049 14:17:05 -- pm/common@50 -- $ kill -TERM 180042 00:04:10.049 14:17:05 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:10.049 14:17:05 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:04:10.049 14:17:05 -- pm/common@44 -- $ pid=180065 00:04:10.049 14:17:05 -- pm/common@50 -- $ sudo -E kill -TERM 180065 00:04:10.049 14:17:06 -- spdk/autorun.sh@26 -- $ (( SPDK_TEST_UNITTEST == 1 || SPDK_RUN_FUNCTIONAL_TEST == 1 )) 00:04:10.049 14:17:06 -- spdk/autorun.sh@27 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:04:10.049 14:17:06 -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:04:10.049 14:17:06 -- common/autotest_common.sh@1693 -- # lcov --version 00:04:10.049 14:17:06 -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:04:10.309 14:17:06 -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:04:10.309 14:17:06 -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:10.309 14:17:06 -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:10.309 14:17:06 -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:10.309 14:17:06 -- scripts/common.sh@336 -- # IFS=.-: 00:04:10.309 14:17:06 -- scripts/common.sh@336 -- # read -ra ver1 00:04:10.309 14:17:06 -- scripts/common.sh@337 -- # IFS=.-: 00:04:10.309 14:17:06 -- scripts/common.sh@337 -- # read -ra ver2 00:04:10.309 14:17:06 -- scripts/common.sh@338 -- # local 'op=<' 00:04:10.309 14:17:06 -- scripts/common.sh@340 -- # ver1_l=2 00:04:10.309 14:17:06 -- scripts/common.sh@341 -- # ver2_l=1 00:04:10.309 14:17:06 -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:10.309 14:17:06 -- scripts/common.sh@344 -- # case "$op" in 00:04:10.309 14:17:06 -- scripts/common.sh@345 -- # : 1 00:04:10.309 14:17:06 -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:10.309 14:17:06 -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:10.309 14:17:06 -- scripts/common.sh@365 -- # decimal 1 00:04:10.309 14:17:06 -- scripts/common.sh@353 -- # local d=1 00:04:10.309 14:17:06 -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:10.309 14:17:06 -- scripts/common.sh@355 -- # echo 1 00:04:10.309 14:17:06 -- scripts/common.sh@365 -- # ver1[v]=1 00:04:10.309 14:17:06 -- scripts/common.sh@366 -- # decimal 2 00:04:10.309 14:17:06 -- scripts/common.sh@353 -- # local d=2 00:04:10.309 14:17:06 -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:10.309 14:17:06 -- scripts/common.sh@355 -- # echo 2 00:04:10.309 14:17:06 -- scripts/common.sh@366 -- # ver2[v]=2 00:04:10.309 14:17:06 -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:10.309 14:17:06 -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:10.309 14:17:06 -- scripts/common.sh@368 -- # return 0 00:04:10.309 14:17:06 -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:10.309 14:17:06 -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:04:10.309 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:10.309 --rc genhtml_branch_coverage=1 00:04:10.309 --rc genhtml_function_coverage=1 00:04:10.309 --rc genhtml_legend=1 00:04:10.309 --rc geninfo_all_blocks=1 00:04:10.309 --rc geninfo_unexecuted_blocks=1 00:04:10.309 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:10.309 ' 00:04:10.309 14:17:06 -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:04:10.309 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:10.309 --rc genhtml_branch_coverage=1 00:04:10.309 --rc genhtml_function_coverage=1 00:04:10.309 --rc genhtml_legend=1 00:04:10.309 --rc geninfo_all_blocks=1 00:04:10.309 --rc geninfo_unexecuted_blocks=1 00:04:10.309 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:10.309 ' 00:04:10.309 14:17:06 -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:04:10.309 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:10.309 --rc genhtml_branch_coverage=1 00:04:10.309 --rc genhtml_function_coverage=1 00:04:10.309 --rc genhtml_legend=1 00:04:10.309 --rc geninfo_all_blocks=1 00:04:10.309 --rc geninfo_unexecuted_blocks=1 00:04:10.309 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:10.309 ' 00:04:10.309 14:17:06 -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:04:10.309 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:10.309 --rc genhtml_branch_coverage=1 00:04:10.309 --rc genhtml_function_coverage=1 00:04:10.309 --rc genhtml_legend=1 00:04:10.309 --rc geninfo_all_blocks=1 00:04:10.309 --rc geninfo_unexecuted_blocks=1 00:04:10.309 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:10.309 ' 00:04:10.309 14:17:06 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:04:10.309 14:17:06 -- nvmf/common.sh@7 -- # uname -s 00:04:10.309 14:17:06 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:04:10.309 14:17:06 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:04:10.309 14:17:06 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:04:10.309 14:17:06 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:04:10.309 14:17:06 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:04:10.309 14:17:06 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:04:10.309 14:17:06 -- nvmf/common.sh@14 -- 
# NVMF_TCP_IP_ADDRESS=127.0.0.1 00:04:10.309 14:17:06 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:04:10.309 14:17:06 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:04:10.309 14:17:06 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:04:10.309 14:17:06 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:04:10.309 14:17:06 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:04:10.309 14:17:06 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:04:10.309 14:17:06 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:04:10.309 14:17:06 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:04:10.309 14:17:06 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:04:10.309 14:17:06 -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:04:10.309 14:17:06 -- scripts/common.sh@15 -- # shopt -s extglob 00:04:10.309 14:17:06 -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:04:10.309 14:17:06 -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:04:10.309 14:17:06 -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:04:10.309 14:17:06 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:10.309 14:17:06 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:10.309 14:17:06 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:10.309 14:17:06 -- paths/export.sh@5 -- # export PATH 00:04:10.309 14:17:06 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:10.309 14:17:06 -- nvmf/common.sh@51 -- # : 0 00:04:10.309 14:17:06 -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:04:10.309 14:17:06 -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:04:10.309 14:17:06 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:04:10.309 14:17:06 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:04:10.309 14:17:06 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:04:10.309 14:17:06 -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:04:10.309 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:04:10.309 14:17:06 -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:04:10.309 14:17:06 -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:04:10.309 14:17:06 -- nvmf/common.sh@55 -- # have_pci_nics=0 00:04:10.309 14:17:06 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:04:10.309 14:17:06 -- spdk/autotest.sh@32 -- # uname -s 00:04:10.309 
14:17:06 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:04:10.309 14:17:06 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:04:10.309 14:17:06 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:04:10.309 14:17:06 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:04:10.309 14:17:06 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:04:10.309 14:17:06 -- spdk/autotest.sh@44 -- # modprobe nbd 00:04:10.309 14:17:06 -- spdk/autotest.sh@46 -- # type -P udevadm 00:04:10.309 14:17:06 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:04:10.309 14:17:06 -- spdk/autotest.sh@48 -- # udevadm_pid=259129 00:04:10.309 14:17:06 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:04:10.309 14:17:06 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:04:10.309 14:17:06 -- pm/common@17 -- # local monitor 00:04:10.309 14:17:06 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:10.309 14:17:06 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:10.309 14:17:06 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:10.309 14:17:06 -- pm/common@21 -- # date +%s 00:04:10.309 14:17:06 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:10.309 14:17:06 -- pm/common@21 -- # date +%s 00:04:10.309 14:17:06 -- pm/common@25 -- # sleep 1 00:04:10.309 14:17:06 -- pm/common@21 -- # date +%s 00:04:10.309 14:17:06 -- pm/common@21 -- # date +%s 00:04:10.310 14:17:06 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1731935826 00:04:10.310 14:17:06 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1731935826 00:04:10.310 14:17:06 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1731935826 00:04:10.310 14:17:06 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1731935826 00:04:10.310 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1731935826_collect-cpu-load.pm.log 00:04:10.310 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1731935826_collect-vmstat.pm.log 00:04:10.310 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1731935826_collect-cpu-temp.pm.log 00:04:10.310 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1731935826_collect-bmc-pm.bmc.pm.log 00:04:11.250 14:17:07 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:04:11.250 14:17:07 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:04:11.250 14:17:07 -- common/autotest_common.sh@726 -- # xtrace_disable 00:04:11.250 14:17:07 -- common/autotest_common.sh@10 -- # set +x 
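The four Redirecting lines above come from the resource monitors (cpu-load, vmstat, cpu-temp, bmc-pm) that autotest launches before the run and signals with TERM afterwards, matching the stop_monitor_resources trace earlier in this log. A simplified sketch of that pidfile handshake, assuming the collector's PID ends up in <name>.pid under the power output directory; the helper names and the $output_dir/$rootdir variables here are hypothetical:

    power_dir=$output_dir/power          # e.g. .../spdk/../output/power
    ts=$(date +%s)                       # the 1731935826 suffix seen above

    start_monitor() {                    # e.g. start_monitor collect-cpu-load
        local mon=$1
        "$rootdir/scripts/perf/pm/$mon" -d "$power_dir" -l \
            -p "monitor.autotest.sh.$ts" &
        echo $! > "$power_dir/$mon.pid"  # the real collectors may write this themselves
    }

    signal_monitor() {                   # TERM a collector only if its pidfile exists
        local mon=$1 pid
        [[ -e $power_dir/$mon.pid ]] || return 0
        pid=$(<"$power_dir/$mon.pid")
        kill -TERM "$pid"
    }

The existence check before the kill is what makes teardown safe when a monitor never started, which is exactly the [[ -e ...pid ]] / kill -TERM sequence visible in the pm/common trace.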
00:04:11.250 14:17:07 -- spdk/autotest.sh@59 -- # create_test_list 00:04:11.250 14:17:07 -- common/autotest_common.sh@752 -- # xtrace_disable 00:04:11.250 14:17:07 -- common/autotest_common.sh@10 -- # set +x 00:04:11.250 14:17:07 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:04:11.250 14:17:07 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:04:11.250 14:17:07 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:04:11.250 14:17:07 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:04:11.250 14:17:07 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:04:11.250 14:17:07 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:04:11.250 14:17:07 -- common/autotest_common.sh@1457 -- # uname 00:04:11.250 14:17:07 -- common/autotest_common.sh@1457 -- # '[' Linux = FreeBSD ']' 00:04:11.250 14:17:07 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:04:11.250 14:17:07 -- common/autotest_common.sh@1477 -- # uname 00:04:11.510 14:17:07 -- common/autotest_common.sh@1477 -- # [[ Linux = FreeBSD ]] 00:04:11.510 14:17:07 -- spdk/autotest.sh@68 -- # [[ y == y ]] 00:04:11.510 14:17:07 -- spdk/autotest.sh@70 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh --version 00:04:11.510 lcov: LCOV version 1.15 00:04:11.510 14:17:07 -- spdk/autotest.sh@72 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -i -t Baseline -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info 00:04:16.792 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/mdns_server.gcno 00:04:22.076 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:04:27.355 14:17:23 -- spdk/autotest.sh@76 -- # timing_enter pre_cleanup 00:04:27.355 14:17:23 -- common/autotest_common.sh@726 -- # xtrace_disable 00:04:27.355 14:17:23 -- common/autotest_common.sh@10 -- # set +x 00:04:27.355 14:17:23 -- spdk/autotest.sh@78 -- # rm -f 00:04:27.355 14:17:23 -- spdk/autotest.sh@81 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:30.650 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:04:30.650 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:04:30.650 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:04:30.650 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:04:30.650 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:04:30.650 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:04:30.650 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:04:30.650 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:04:30.650 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:04:30.650 
0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:04:30.650 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:04:30.650 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:04:30.650 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:04:30.650 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:04:30.909 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:04:30.909 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:04:30.909 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:04:30.909 14:17:26 -- spdk/autotest.sh@83 -- # get_zoned_devs 00:04:30.909 14:17:26 -- common/autotest_common.sh@1657 -- # zoned_devs=() 00:04:30.909 14:17:26 -- common/autotest_common.sh@1657 -- # local -gA zoned_devs 00:04:30.909 14:17:26 -- common/autotest_common.sh@1658 -- # local nvme bdf 00:04:30.909 14:17:26 -- common/autotest_common.sh@1660 -- # for nvme in /sys/block/nvme* 00:04:30.909 14:17:26 -- common/autotest_common.sh@1661 -- # is_block_zoned nvme0n1 00:04:30.909 14:17:26 -- common/autotest_common.sh@1650 -- # local device=nvme0n1 00:04:30.909 14:17:26 -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:30.909 14:17:26 -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:04:30.909 14:17:26 -- spdk/autotest.sh@85 -- # (( 0 > 0 )) 00:04:30.909 14:17:26 -- spdk/autotest.sh@97 -- # for dev in /dev/nvme*n!(*p*) 00:04:30.909 14:17:26 -- spdk/autotest.sh@99 -- # [[ -z '' ]] 00:04:30.909 14:17:26 -- spdk/autotest.sh@100 -- # block_in_use /dev/nvme0n1 00:04:30.909 14:17:26 -- scripts/common.sh@381 -- # local block=/dev/nvme0n1 pt 00:04:30.909 14:17:26 -- scripts/common.sh@390 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:04:30.909 No valid GPT data, bailing 00:04:30.909 14:17:26 -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:30.909 14:17:26 -- scripts/common.sh@394 -- # pt= 00:04:30.909 14:17:26 -- scripts/common.sh@395 -- # return 1 00:04:30.909 14:17:26 -- spdk/autotest.sh@101 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:04:30.909 1+0 records in 00:04:30.909 1+0 records out 00:04:30.909 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00483474 s, 217 MB/s 00:04:30.909 14:17:26 -- spdk/autotest.sh@105 -- # sync 00:04:30.909 14:17:26 -- spdk/autotest.sh@107 -- # xtrace_disable_per_cmd reap_spdk_processes 00:04:30.909 14:17:26 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:04:30.909 14:17:26 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:04:39.052 14:17:34 -- spdk/autotest.sh@111 -- # uname -s 00:04:39.052 14:17:34 -- spdk/autotest.sh@111 -- # [[ Linux == Linux ]] 00:04:39.052 14:17:34 -- spdk/autotest.sh@111 -- # [[ 1 -eq 1 ]] 00:04:39.052 14:17:34 -- spdk/autotest.sh@112 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:04:39.052 14:17:34 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:39.052 14:17:34 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:39.052 14:17:34 -- common/autotest_common.sh@10 -- # set +x 00:04:39.052 ************************************ 00:04:39.052 START TEST setup.sh 00:04:39.052 ************************************ 00:04:39.052 14:17:34 setup.sh -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:04:39.052 * Looking for test storage... 
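This stretch is the pre-cleanup device sweep: get_zoned_devs walks /sys/block/nvme* and, per namespace, checks whether queue/zoned exists and reads something other than "none" (hence the `[[ none != none ]]` test above, which is false for the ordinary nvme0n1, leaving it unfiltered); the GPT probe then bails with "No valid GPT data, bailing", so the namespace is treated as unclaimed and its first MiB is zeroed with dd before the sync. A condensed sketch of the zoned-device filter as traced; the loop shape mirrors the trace, while the associative-array bookkeeping is illustrative:

# Collect zoned namespaces so destructive tests skip them.
declare -A zoned_devs
for nvme in /sys/block/nvme*; do
    dev=${nvme##*/}
    # Zoned iff the queue/zoned attribute exists and is not "none".
    if [[ -e $nvme/queue/zoned && $(<"$nvme/queue/zoned") != none ]]; then
        zoned_devs[$dev]=1
    fi
done
echo "zoned devices: ${!zoned_devs[@]}"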
00:04:39.052 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:39.052 14:17:34 setup.sh -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:04:39.052 14:17:34 setup.sh -- common/autotest_common.sh@1693 -- # lcov --version 00:04:39.052 14:17:34 setup.sh -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:04:39.052 14:17:34 setup.sh -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@336 -- # IFS=.-: 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@336 -- # read -ra ver1 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@337 -- # IFS=.-: 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@337 -- # read -ra ver2 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@338 -- # local 'op=<' 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@340 -- # ver1_l=2 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@341 -- # ver2_l=1 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@344 -- # case "$op" in 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@345 -- # : 1 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@365 -- # decimal 1 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@353 -- # local d=1 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@355 -- # echo 1 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@365 -- # ver1[v]=1 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@366 -- # decimal 2 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@353 -- # local d=2 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@355 -- # echo 2 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@366 -- # ver2[v]=2 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:39.052 14:17:34 setup.sh -- scripts/common.sh@368 -- # return 0 00:04:39.052 14:17:34 setup.sh -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:39.052 14:17:34 setup.sh -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:04:39.052 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:39.052 --rc genhtml_branch_coverage=1 00:04:39.052 --rc genhtml_function_coverage=1 00:04:39.052 --rc genhtml_legend=1 00:04:39.052 --rc geninfo_all_blocks=1 00:04:39.052 --rc geninfo_unexecuted_blocks=1 00:04:39.052 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:39.052 ' 00:04:39.052 14:17:34 setup.sh -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:04:39.052 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:39.052 --rc genhtml_branch_coverage=1 00:04:39.052 --rc genhtml_function_coverage=1 00:04:39.052 --rc genhtml_legend=1 00:04:39.052 --rc geninfo_all_blocks=1 00:04:39.052 --rc geninfo_unexecuted_blocks=1 
00:04:39.052 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:39.052 ' 00:04:39.052 14:17:34 setup.sh -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:04:39.052 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:39.052 --rc genhtml_branch_coverage=1 00:04:39.052 --rc genhtml_function_coverage=1 00:04:39.052 --rc genhtml_legend=1 00:04:39.052 --rc geninfo_all_blocks=1 00:04:39.052 --rc geninfo_unexecuted_blocks=1 00:04:39.052 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:39.052 ' 00:04:39.052 14:17:34 setup.sh -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:04:39.052 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:39.052 --rc genhtml_branch_coverage=1 00:04:39.052 --rc genhtml_function_coverage=1 00:04:39.052 --rc genhtml_legend=1 00:04:39.052 --rc geninfo_all_blocks=1 00:04:39.052 --rc geninfo_unexecuted_blocks=1 00:04:39.052 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:39.052 ' 00:04:39.052 14:17:34 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:04:39.052 14:17:34 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:04:39.052 14:17:34 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:04:39.052 14:17:34 setup.sh -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:39.052 14:17:34 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:39.052 14:17:34 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:39.052 ************************************ 00:04:39.052 START TEST acl 00:04:39.052 ************************************ 00:04:39.052 14:17:34 setup.sh.acl -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:04:39.052 * Looking for test storage... 
00:04:39.053 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:39.053 14:17:34 setup.sh.acl -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:04:39.053 14:17:34 setup.sh.acl -- common/autotest_common.sh@1693 -- # lcov --version 00:04:39.053 14:17:34 setup.sh.acl -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:04:39.053 14:17:34 setup.sh.acl -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@336 -- # IFS=.-: 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@336 -- # read -ra ver1 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@337 -- # IFS=.-: 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@337 -- # read -ra ver2 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@338 -- # local 'op=<' 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@340 -- # ver1_l=2 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@341 -- # ver2_l=1 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@344 -- # case "$op" in 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@345 -- # : 1 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@365 -- # decimal 1 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@353 -- # local d=1 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@355 -- # echo 1 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@365 -- # ver1[v]=1 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@366 -- # decimal 2 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@353 -- # local d=2 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@355 -- # echo 2 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@366 -- # ver2[v]=2 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:39.053 14:17:34 setup.sh.acl -- scripts/common.sh@368 -- # return 0 00:04:39.053 14:17:34 setup.sh.acl -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:39.053 14:17:34 setup.sh.acl -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:04:39.053 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:39.053 --rc genhtml_branch_coverage=1 00:04:39.053 --rc genhtml_function_coverage=1 00:04:39.053 --rc genhtml_legend=1 00:04:39.053 --rc geninfo_all_blocks=1 00:04:39.053 --rc geninfo_unexecuted_blocks=1 00:04:39.053 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:39.053 ' 00:04:39.053 14:17:34 setup.sh.acl -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:04:39.053 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:39.053 --rc genhtml_branch_coverage=1 00:04:39.053 --rc 
genhtml_function_coverage=1 00:04:39.053 --rc genhtml_legend=1 00:04:39.053 --rc geninfo_all_blocks=1 00:04:39.053 --rc geninfo_unexecuted_blocks=1 00:04:39.053 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:39.053 ' 00:04:39.053 14:17:34 setup.sh.acl -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:04:39.053 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:39.053 --rc genhtml_branch_coverage=1 00:04:39.053 --rc genhtml_function_coverage=1 00:04:39.053 --rc genhtml_legend=1 00:04:39.053 --rc geninfo_all_blocks=1 00:04:39.053 --rc geninfo_unexecuted_blocks=1 00:04:39.053 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:39.053 ' 00:04:39.053 14:17:34 setup.sh.acl -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:04:39.053 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:39.053 --rc genhtml_branch_coverage=1 00:04:39.053 --rc genhtml_function_coverage=1 00:04:39.053 --rc genhtml_legend=1 00:04:39.053 --rc geninfo_all_blocks=1 00:04:39.053 --rc geninfo_unexecuted_blocks=1 00:04:39.053 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:39.053 ' 00:04:39.053 14:17:34 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:04:39.053 14:17:34 setup.sh.acl -- common/autotest_common.sh@1657 -- # zoned_devs=() 00:04:39.053 14:17:34 setup.sh.acl -- common/autotest_common.sh@1657 -- # local -gA zoned_devs 00:04:39.053 14:17:34 setup.sh.acl -- common/autotest_common.sh@1658 -- # local nvme bdf 00:04:39.053 14:17:34 setup.sh.acl -- common/autotest_common.sh@1660 -- # for nvme in /sys/block/nvme* 00:04:39.053 14:17:34 setup.sh.acl -- common/autotest_common.sh@1661 -- # is_block_zoned nvme0n1 00:04:39.053 14:17:34 setup.sh.acl -- common/autotest_common.sh@1650 -- # local device=nvme0n1 00:04:39.053 14:17:34 setup.sh.acl -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:39.053 14:17:34 setup.sh.acl -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:04:39.053 14:17:34 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:04:39.053 14:17:34 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:04:39.053 14:17:34 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:04:39.053 14:17:34 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:04:39.053 14:17:34 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:04:39.053 14:17:34 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:39.053 14:17:34 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:43.251 14:17:38 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:04:43.251 14:17:38 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:04:43.251 14:17:38 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:43.251 14:17:38 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:04:43.251 14:17:38 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:04:43.251 14:17:38 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:04:46.547 Hugepages 00:04:46.547 node hugesize free / total 00:04:46.547 14:17:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:46.547 14:17:42 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:46.547 14:17:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.547 14:17:42 
setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:46.547 14:17:42 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:46.547 14:17:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.547 14:17:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:46.547 14:17:42 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:46.547 14:17:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.547 00:04:46.547 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:46.547 14:17:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:46.547 14:17:42 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:46.547 14:17:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.547 14:17:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:04:46.547 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.547 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:46.547 14:17:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.547 14:17:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:04:46.547 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.547 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:46.547 14:17:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.547 14:17:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:04:46.547 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.547 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:46.547 14:17:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.547 14:17:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:04:46.547 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme 
]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:04:46.548 14:17:42 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:04:46.548 14:17:42 setup.sh.acl -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:46.548 14:17:42 setup.sh.acl -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:46.548 14:17:42 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:46.548 ************************************ 00:04:46.548 START TEST denied 00:04:46.548 ************************************ 00:04:46.548 14:17:42 setup.sh.acl.denied -- 
common/autotest_common.sh@1129 -- # denied 00:04:46.548 14:17:42 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:04:46.548 14:17:42 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:04:46.548 14:17:42 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:04:46.548 14:17:42 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:04:46.548 14:17:42 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:50.748 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:04:50.748 14:17:46 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:04:50.748 14:17:46 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:04:50.748 14:17:46 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:04:50.748 14:17:46 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:04:50.748 14:17:46 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:04:50.748 14:17:46 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:04:50.748 14:17:46 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:04:50.748 14:17:46 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:04:50.748 14:17:46 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:50.748 14:17:46 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:54.949 00:04:54.949 real 0m8.498s 00:04:54.949 user 0m2.701s 00:04:54.949 sys 0m5.110s 00:04:54.949 14:17:50 setup.sh.acl.denied -- common/autotest_common.sh@1130 -- # xtrace_disable 00:04:54.949 14:17:50 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:04:54.949 ************************************ 00:04:54.949 END TEST denied 00:04:54.949 ************************************ 00:04:54.949 14:17:51 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:04:54.949 14:17:51 setup.sh.acl -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:54.949 14:17:51 setup.sh.acl -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:54.949 14:17:51 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:55.210 ************************************ 00:04:55.210 START TEST allowed 00:04:55.210 ************************************ 00:04:55.210 14:17:51 setup.sh.acl.allowed -- common/autotest_common.sh@1129 -- # allowed 00:04:55.210 14:17:51 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:04:55.210 14:17:51 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:04:55.210 14:17:51 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:04:55.210 14:17:51 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:04:55.210 14:17:51 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:00.500 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:00.500 14:17:56 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:05:00.500 14:17:56 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:05:00.500 14:17:56 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:05:00.500 14:17:56 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:00.500 14:17:56 setup.sh.acl.allowed 
-- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:04.710 00:05:04.710 real 0m9.133s 00:05:04.710 user 0m2.591s 00:05:04.710 sys 0m5.129s 00:05:04.710 14:18:00 setup.sh.acl.allowed -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:04.710 14:18:00 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:05:04.710 ************************************ 00:05:04.710 END TEST allowed 00:05:04.710 ************************************ 00:05:04.710 00:05:04.710 real 0m25.533s 00:05:04.710 user 0m8.210s 00:05:04.710 sys 0m15.524s 00:05:04.710 14:18:00 setup.sh.acl -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:04.710 14:18:00 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:05:04.710 ************************************ 00:05:04.710 END TEST acl 00:05:04.710 ************************************ 00:05:04.710 14:18:00 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:05:04.710 14:18:00 setup.sh -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:04.710 14:18:00 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:04.710 14:18:00 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:04.710 ************************************ 00:05:04.710 START TEST hugepages 00:05:04.710 ************************************ 00:05:04.711 14:18:00 setup.sh.hugepages -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:05:04.711 * Looking for test storage... 00:05:04.711 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:04.711 14:18:00 setup.sh.hugepages -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:05:04.711 14:18:00 setup.sh.hugepages -- common/autotest_common.sh@1693 -- # lcov --version 00:05:04.711 14:18:00 setup.sh.hugepages -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:05:04.711 14:18:00 setup.sh.hugepages -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@336 -- # IFS=.-: 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@336 -- # read -ra ver1 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@337 -- # IFS=.-: 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@337 -- # read -ra ver2 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@338 -- # local 'op=<' 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@340 -- # ver1_l=2 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@341 -- # ver2_l=1 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@344 -- # case "$op" in 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@345 -- # : 1 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@365 -- # decimal 1 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@353 -- # local d=1 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@355 -- # echo 1 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@365 -- # ver1[v]=1 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@366 -- # decimal 2 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@353 -- # local d=2 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@355 -- # echo 2 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@366 -- # ver2[v]=2 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:04.711 14:18:00 setup.sh.hugepages -- scripts/common.sh@368 -- # return 0 00:05:04.711 14:18:00 setup.sh.hugepages -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:04.711 14:18:00 setup.sh.hugepages -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:05:04.711 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:04.711 --rc genhtml_branch_coverage=1 00:05:04.711 --rc genhtml_function_coverage=1 00:05:04.711 --rc genhtml_legend=1 00:05:04.711 --rc geninfo_all_blocks=1 00:05:04.711 --rc geninfo_unexecuted_blocks=1 00:05:04.711 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:04.711 ' 00:05:04.711 14:18:00 setup.sh.hugepages -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:05:04.711 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:04.711 --rc genhtml_branch_coverage=1 00:05:04.711 --rc genhtml_function_coverage=1 00:05:04.711 --rc genhtml_legend=1 00:05:04.711 --rc geninfo_all_blocks=1 00:05:04.711 --rc geninfo_unexecuted_blocks=1 00:05:04.711 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:04.711 ' 00:05:04.711 14:18:00 setup.sh.hugepages -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:05:04.711 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:04.711 --rc genhtml_branch_coverage=1 00:05:04.711 --rc genhtml_function_coverage=1 00:05:04.711 --rc genhtml_legend=1 00:05:04.711 --rc geninfo_all_blocks=1 00:05:04.711 --rc geninfo_unexecuted_blocks=1 00:05:04.711 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:04.711 ' 00:05:04.711 14:18:00 setup.sh.hugepages -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:05:04.711 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:04.711 --rc genhtml_branch_coverage=1 00:05:04.711 --rc genhtml_function_coverage=1 00:05:04.711 --rc genhtml_legend=1 00:05:04.711 --rc geninfo_all_blocks=1 00:05:04.711 --rc geninfo_unexecuted_blocks=1 00:05:04.711 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:04.711 ' 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:05:04.711 14:18:00 
setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 38329312 kB' 'MemAvailable: 42057288 kB' 'Buffers: 8940 kB' 'Cached: 13232200 kB' 'SwapCached: 0 kB' 'Active: 10527040 kB' 'Inactive: 3688224 kB' 'Active(anon): 10110248 kB' 'Inactive(anon): 0 kB' 'Active(file): 416792 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 977692 kB' 'Mapped: 148104 kB' 'Shmem: 9136124 kB' 'KReclaimable: 239652 kB' 'Slab: 875052 kB' 'SReclaimable: 239652 kB' 'SUnreclaim: 635400 kB' 'KernelStack: 21952 kB' 'PageTables: 8912 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36433336 kB' 'Committed_AS: 11642156 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214196 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB' 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 
00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.711 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.712 14:18:00 
setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r 
var val _ 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.712 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.713 
14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.713 
14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 
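The long run of `[[ <field> == \H\u\g\e\p\a\g\e\s\i\z\e ]] ... continue` records above is nothing more than xtrace of get_meminfo scanning each /proc/meminfo field in turn until it reaches Hugepagesize, where it echoes 2048 and returns; the large printf near the start of the scan is the whole meminfo table captured by mapfile. A condensed sketch of the same scan, reconstructed from the trace rather than copied from setup/common.sh (the per-NUMA-node branch via /sys/devices/system/node/node$node/meminfo, bypassed by the empty `node=` above, is elided):

# Print the value of a single /proc/meminfo field, e.g. Hugepagesize -> 2048.
get_meminfo() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done </proc/meminfo
    return 1
}
default_hugepages=$(get_meminfo Hugepagesize)   # 2048 (kB) on this machine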
00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGEMEM 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGENODE 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v NRHUGE 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@197 -- # get_nodes 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@26 -- # local node 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@198 -- # clear_hp 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@36 -- # local node hp 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@44 -- # export CLEAR_HUGE=yes 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@44 -- # CLEAR_HUGE=yes 00:05:04.713 14:18:00 setup.sh.hugepages -- setup/hugepages.sh@200 -- # run_test single_node_setup single_node_setup 00:05:04.713 14:18:00 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:04.714 14:18:00 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:04.714 14:18:00 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:04.714 ************************************ 00:05:04.714 START TEST single_node_setup 00:05:04.714 ************************************ 00:05:04.714 14:18:00 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@1129 -- # single_node_setup 00:05:04.714 14:18:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@135 -- # get_test_nr_hugepages 2097152 0 00:05:04.714 14:18:00 setup.sh.hugepages.single_node_setup 
00:05:04.714 14:18:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@48 -- # local size=2097152
00:05:04.714 14:18:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@49 -- # (( 2 > 1 ))
00:05:04.714 14:18:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@50 -- # shift
00:05:04.714 14:18:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@51 -- # node_ids=('0')
00:05:04.714 14:18:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@51 -- # local node_ids
00:05:04.714 14:18:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:04.714 14:18:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:05:04.714 14:18:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 0
00:05:04.714 14:18:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@61 -- # user_nodes=('0')
00:05:04.714 14:18:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@61 -- # local user_nodes
00:05:04.714 14:18:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
00:05:04.714 14:18:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:04.714 14:18:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:04.714 14:18:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:04.714 14:18:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@68 -- # (( 1 > 0 ))
00:05:04.714 14:18:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@69 -- # for _no_nodes in "${user_nodes[@]}"
00:05:04.714 14:18:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@70 -- # nodes_test[_no_nodes]=1024
00:05:04.714 14:18:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@72 -- # return 0
00:05:04.714 14:18:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # NRHUGE=1024
00:05:04.714 14:18:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # HUGENODE=0
00:05:04.714 14:18:00 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # setup output
00:05:04.714 14:18:00 setup.sh.hugepages.single_node_setup -- setup/common.sh@9 -- # [[ output == output ]]
00:05:04.714 14:18:00 setup.sh.hugepages.single_node_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:08.013 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci
00:05:08.013 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci
00:05:08.013 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci
00:05:08.013 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci
00:05:08.013 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci
00:05:08.013 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci
00:05:08.273 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci
00:05:08.274 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci
00:05:08.274 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci
00:05:08.274 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci
00:05:08.274 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci
00:05:08.274 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci
00:05:08.274 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci
00:05:08.274 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci
00:05:08.274 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci
00:05:08.274 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci
00:05:10.192 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci
00:05:10.192 14:18:05 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@137 -- # verify_nr_hugepages
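get_test_nr_hugepages, traced above, converts the requested pool size into a page count: 2097152 kB (2 GiB) at the 2048 kB default hugepage size is 1024 pages, and HUGENODE=0 confines the allocation to node 0 when scripts/setup.sh rebinds the test devices to vfio-pci and sizes the pool. The arithmetic as a sketch (illustrative; the setup.sh path is shortened relative to the trace):

    # Sketch of the size -> page-count conversion traced above (illustrative).
    size_kb=2097152            # requested pool: 2 GiB expressed in kB
    default_hugepages=2048     # Hugepagesize from /proc/meminfo, in kB
    (( size_kb >= default_hugepages )) || exit 1
    nr_hugepages=$(( size_kb / default_hugepages ))      # -> 1024 pages
    NRHUGE=$nr_hugepages HUGENODE=0 ./scripts/setup.sh   # allocate on node 0 only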
00:05:10.192 14:18:05 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@88 -- # local node
00:05:10.192 14:18:05 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@89 -- # local sorted_t
00:05:10.192 14:18:05 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@90 -- # local sorted_s
00:05:10.192 14:18:05 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@91 -- # local surp
00:05:10.192 14:18:05 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@92 -- # local resv
00:05:10.192 14:18:05 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@93 -- # local anon
00:05:10.192 14:18:05 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:10.192 14:18:05 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:10.192 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:10.192 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=
00:05:10.192 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val
00:05:10.192 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem
00:05:10.192 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:10.192 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:10.192 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:10.192 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem
00:05:10.193 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:10.193 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:05:10.193 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:05:10.193 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 40493860 kB' 'MemAvailable: 44221156 kB' 'Buffers: 8940 kB' 'Cached: 13232356 kB' 'SwapCached: 0 kB' 'Active: 10528884 kB' 'Inactive: 3688224 kB' 'Active(anon): 10112092 kB' 'Inactive(anon): 0 kB' 'Active(file): 416792 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 979240 kB' 'Mapped: 147596 kB' 'Shmem: 9136280 kB' 'KReclaimable: 238292 kB' 'Slab: 872692 kB' 'SReclaimable: 238292 kB' 'SUnreclaim: 634400 kB' 'KernelStack: 22016 kB' 'PageTables: 8736 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 11640484 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214404 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
[... IFS/read/continue skip iterations for the fields MemTotal through HardwareCorrupted elided ...]
00:05:10.194 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:10.194 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0
00:05:10.194 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:05:10.194 14:18:05 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@96 -- # anon=0
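The anon branch of verify_nr_hugepages samples AnonHugePages only when transparent hugepages are not disabled: the bracketed token in the THP mode string ([madvise] in this run) marks the active mode, and the glob test above rejects only [never]. A sketch of that gate, reusing get_meminfo from the earlier sketch (the sysfs path is the standard kernel location and is an assumption here; the trace shows only the already-expanded string):

    # Sketch of the THP gate traced above (illustrative; the sysfs path is
    # assumed -- the xtrace shows only the expanded mode string).
    thp=$(</sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
    if [[ $thp != *"[never]"* ]]; then
        anon=$(get_meminfo AnonHugePages)   # THP may be active: sample anon usage
    else
        anon=0                              # THP off: nothing to account for
    fi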
00:05:10.194 14:18:05 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:10.194 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:10.194 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=
00:05:10.194 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val
00:05:10.194 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem
00:05:10.194 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:10.194 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:10.194 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:10.194 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem
00:05:10.194 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:10.194 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:05:10.194 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:05:10.194 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 40495300 kB' 'MemAvailable: 44222596 kB' 'Buffers: 8940 kB' 'Cached: 13232356 kB' 'SwapCached: 0 kB' 'Active: 10523112 kB' 'Inactive: 3688224 kB' 'Active(anon): 10106320 kB' 'Inactive(anon): 0 kB' 'Active(file): 416792 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 973420 kB' 'Mapped: 147056 kB' 'Shmem: 9136280 kB' 'KReclaimable: 238292 kB' 'Slab: 872708 kB' 'SReclaimable: 238292 kB' 'SUnreclaim: 634416 kB' 'KernelStack: 21968 kB' 'PageTables: 8924 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 11634180 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214352 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
[... IFS/read/continue skip iterations for the fields MemTotal through HugePages_Rsvd elided ...]
00:05:10.196 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:10.196 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0
00:05:10.196 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:05:10.196 14:18:05 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@98 -- # surp=0
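Each get_meminfo pass above re-snapshots /proc/meminfo and pulls one hugepage counter; in this run anonymous and surplus pages are both 0, the reserved-page fetch follows below, and the snapshots already show HugePages_Total at 1024, matching NRHUGE. The sampling as a compact sketch (illustrative; the comparison logic sits past the end of this excerpt):

    # Sketch of the counter sampling traced above (illustrative).
    anon=$(get_meminfo AnonHugePages)      # 0 in this run
    surp=$(get_meminfo HugePages_Surp)     # 0
    resv=$(get_meminfo HugePages_Rsvd)     # fetched below; 0 per the snapshot
    total=$(get_meminfo HugePages_Total)   # 1024, matching NRHUGE=1024
    echo "anon=$anon surp=$surp resv=$resv total=$total"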
00:05:10.196 14:18:05 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:10.196 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:10.196 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=
00:05:10.196 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val
00:05:10.196 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem
00:05:10.196 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:10.196 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:10.196 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:10.196 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem
00:05:10.196 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:10.196 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:05:10.196 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:05:10.196 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 40499588 kB' 'MemAvailable: 44226884 kB' 'Buffers: 8940 kB' 'Cached: 13232368 kB' 'SwapCached: 0 kB' 'Active: 10522860 kB' 'Inactive: 3688224 kB' 'Active(anon): 10106068 kB' 'Inactive(anon): 0 kB' 'Active(file): 416792 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 973168 kB' 'Mapped: 147436 kB' 'Shmem: 9136292 kB' 'KReclaimable: 238292 kB' 'Slab: 872724 kB' 'SReclaimable: 238292 kB' 'SUnreclaim: 634432 kB' 'KernelStack: 21936 kB' 'PageTables: 8520 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 11632916 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
[... IFS/read/continue skip iterations for the fields MemTotal through SUnreclaim elided ...]
00:05:10.197 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:10.197 14:18:05
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:10.197 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:10.197 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:10.197 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.197 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:10.197 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:10.197 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:10.197 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.197 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:10.197 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:10.197 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:10.197 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.197 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:10.197 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:10.197 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:10.197 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.197 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:10.197 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:10.198 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:10.198 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.198 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:10.198 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:10.198 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:10.198 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.198 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:10.198 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:10.198 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:10.198 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.198 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:10.198 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:10.198 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:10.198 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.198 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:10.198 14:18:05 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:10.198 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:10.198 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.198 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:10.198 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:10.198 14:18:05 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:10.198 14:18:06 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@99 -- # resv=0 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:05:10.198 nr_hugepages=1024 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:10.198 resv_hugepages=0 00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:10.198 surplus_hugepages=0 00:05:10.198 14:18:06 
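The trace above is setup/common.sh's get_meminfo scanning a meminfo source one "key: value" pair at a time until the requested key (here HugePages_Rsvd) matches, then echoing the value. A minimal standalone sketch of that lookup pattern, with illustrative names (meminfo_value is not the real function, just the same idea):

#!/usr/bin/env bash
# Sketch only: mirrors the lookup pattern in the trace, not the actual
# setup/common.sh code. meminfo_value and its arguments are illustrative.
meminfo_value() {
    local key=$1 node=${2:-}
    local src=/proc/meminfo
    # Per-node counters live under sysfs when a node index is given.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        src=/sys/devices/system/node/node$node/meminfo
    fi
    local var val _
    # Per-node files prefix each line with "Node <N> "; strip it, then
    # split on ':' and whitespace exactly as the traced loop does.
    while IFS=': ' read -r var val _; do
        if [[ $var == "$key" ]]; then
            echo "$val"
            return 0
        fi
    done < <(sed 's/^Node [0-9]* //' "$src")
    return 1
}

meminfo_value HugePages_Rsvd       # prints 0 on the machine traced above
meminfo_value HugePages_Total 0    # node 0's pool, if node0 exists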
00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:05:10.198 anon_hugepages=0
00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages ))
00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=
00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val
00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem
00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem
00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 40499408 kB' 'MemAvailable: 44226704 kB' 'Buffers: 8940 kB' 'Cached: 13232400 kB' 'SwapCached: 0 kB' 'Active: 10522440 kB' 'Inactive: 3688224 kB' 'Active(anon): 10105648 kB' 'Inactive(anon): 0 kB' 'Active(file): 416792 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 972696 kB' 'Mapped: 146976 kB' 'Shmem: 9136324 kB' 'KReclaimable: 238292 kB' 'Slab: 872692 kB' 'SReclaimable: 238292 kB' 'SUnreclaim: 634400 kB' 'KernelStack: 21840 kB' 'PageTables: 8548 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 11634224 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214288 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:10.198 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue
[... the same read/compare/continue cycle repeats for every key, MemFree through Unaccepted, until the requested key matches ...]
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 1024
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@111 -- # get_nodes
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@26 -- # local node
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0
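The hugepages.sh@106 and @109 assertions above check that the pool the kernel reports equals what the test requested plus surplus and reserved pages. A paraphrased sketch of that accounting check (the grab helper is hypothetical, not part of the traced script):

#!/usr/bin/env bash
# Sketch only: paraphrases the hugepages.sh@106/@109 assertions, not the
# verbatim script. The grab helper is hypothetical.
grab() { awk -v k="$1:" '$1 == k {print $2}' /proc/meminfo; }

nr_requested=1024                  # pages the test configured
total=$(grab HugePages_Total)
surp=$(grab HugePages_Surp)
resv=$(grab HugePages_Rsvd)

# The pool the kernel reports must equal request + surplus + reserved.
if (( total == nr_requested + surp + resv )); then
    echo "consistent: $total == $nr_requested + $surp + $resv"
else
    echo "mismatch: HugePages_Total=$total" >&2
    exit 1
fi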
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@31 -- # no_nodes=2
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=0
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 17400392 kB' 'MemUsed: 15184976 kB' 'SwapCached: 0 kB' 'Active: 7536264 kB' 'Inactive: 3531400 kB' 'Active(anon): 7259124 kB' 'Inactive(anon): 0 kB' 'Active(file): 277140 kB' 'Inactive(file): 3531400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10518304 kB' 'Mapped: 128288 kB' 'AnonPages: 552564 kB' 'Shmem: 6709764 kB' 'KernelStack: 12776 kB' 'PageTables: 5548 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 159408 kB' 'Slab: 473444 kB' 'SReclaimable: 159408 kB' 'SUnreclaim: 314036 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:10.200 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue
[... the same read/compare/continue cycle repeats for every node0 meminfo key, MemFree through HugePages_Free, until the requested key matches ...]
00:05:10.202 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:10.202 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0
00:05:10.202 14:18:06 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:05:10.202 14:18:06 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:10.202 14:18:06 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:10.202 14:18:06 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:10.202 14:18:06 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:10.202 14:18:06 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024'
00:05:10.202 node0=1024 expecting 1024
00:05:10.202 14:18:06 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]]
00:05:10.202 
00:05:10.202 real	0m5.389s
00:05:10.202 user	0m1.443s
00:05:10.202 sys	0m2.503s
00:05:10.202 14:18:06 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@1130 -- # xtrace_disable
00:05:10.202 14:18:06 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@10 -- # set +x
00:05:10.202 ************************************
00:05:10.202 END TEST single_node_setup
00:05:10.202 ************************************
00:05:10.202 14:18:06 setup.sh.hugepages -- setup/hugepages.sh@201 -- # run_test even_2G_alloc even_2G_alloc
00:05:10.202 14:18:06 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:05:10.202 14:18:06 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:05:10.202 14:18:06 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:10.202 ************************************
00:05:10.202 START TEST even_2G_alloc
00:05:10.202 ************************************
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1129 -- # even_2G_alloc
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@142 -- # get_test_nr_hugepages 2097152
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@48 -- # local size=2097152
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
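The even_2G_alloc test that starts above sizes a 2 GiB request (2097152 kB at a 2048 kB page size is 1024 pages) and, in the trace that resumes below, get_test_nr_hugepages_per_node gives each of the two NUMA nodes an equal 512-page share. A sketch of that even split, with illustrative names (the real script walks node indices from the highest down, as the nodes_test assignments below show; integer remainder is ignored here):

#!/usr/bin/env bash
# Sketch only: the even split get_test_nr_hugepages_per_node performs,
# not the verbatim script.
shopt -s extglob nullglob

nr_hugepages=1024
nodes=(/sys/devices/system/node/node+([0-9]))
no_nodes=${#nodes[@]}                   # 2 on the machine traced here

declare -a nodes_test
per_node=$((nr_hugepages / no_nodes))   # 1024 / 2 == 512
for node in "${nodes[@]}"; do
    nodes_test[${node##*node}]=$per_node
done

for id in "${!nodes_test[@]}"; do
    echo "node$id expecting ${nodes_test[$id]} hugepages"
done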
00:05:10.202 14:18:06 setup.sh.hugepages -- setup/hugepages.sh@201 -- # run_test even_2G_alloc even_2G_alloc
00:05:10.202 14:18:06 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:05:10.202 14:18:06 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:05:10.202 14:18:06 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:10.202 ************************************
00:05:10.202 START TEST even_2G_alloc
00:05:10.202 ************************************
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1129 -- # even_2G_alloc
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@142 -- # get_test_nr_hugepages 2097152
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@48 -- # local size=2097152
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 ))
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # : 512
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 1
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # : 0
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@143 -- # NRHUGE=1024
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@143 -- # setup output
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:10.202 14:18:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:13.597 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:13.597 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:13.597 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:13.597 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:13.597 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:13.597 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:13.597 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:13.598 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:13.598 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:13.598 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:13.598 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:13.598 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:13.598 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:13.598 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:13.598 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:13.598 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:13.598 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:13.598 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@144 -- # verify_nr_hugepages
00:05:13.598 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@88 -- # local node
00:05:13.598 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:13.598 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_s
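[editor's note] The per-node trace above shows get_test_nr_hugepages_per_node taking NRHUGE=1024 and spreading it evenly over the two NUMA nodes (512 each). A minimal bash sketch of that split policy, with illustrative names, not the verbatim SPDK helper:

# Illustrative only: divide a hugepage budget evenly across NUMA nodes,
# handing any remainder to the lower-numbered nodes first.
split_hugepages_per_node() {
    local total=$1 no_nodes=$2
    local -a per_node
    local n
    for ((n = 0; n < no_nodes; n++)); do
        per_node[n]=$((total / no_nodes))
    done
    for ((n = 0; n < total % no_nodes; n++)); do
        ((per_node[n] += 1))
    done
    for n in "${!per_node[@]}"; do
        echo "node$n=${per_node[n]}"
    done
}
split_hugepages_per_node 1024 2   # -> node0=512, node1=512, matching the trace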
00:05:13.598 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:13.598 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:13.598 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:13.598 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:13.598 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:13.598 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:13.598 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:05:13.598 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:13.598 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:13.598 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:13.598 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:13.598 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:13.598 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:13.598 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:13.598 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:13.598 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:13.598 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 40542808 kB' 'MemAvailable: 44270072 kB' 'Buffers: 8940 kB' 'Cached: 13232520 kB' 'SwapCached: 0 kB' 'Active: 10521332 kB' 'Inactive: 3688224 kB' 'Active(anon): 10104540 kB' 'Inactive(anon): 0 kB' 'Active(file): 416792 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 971280 kB' 'Mapped: 145960 kB' 'Shmem: 9136444 kB' 'KReclaimable: 238228 kB' 'Slab: 872484 kB' 'SReclaimable: 238228 kB' 'SUnreclaim: 634256 kB' 'KernelStack: 21760 kB' 'PageTables: 8148 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 11623904 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214352 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
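[editor's note] Two things worth decoding in the trace above: the "[[ always [madvise] never != *\[\n\e\v\e\r\]* ]]" test checks the contents of /sys/kernel/mm/transparent_hugepage/enabled (the bracketed word is the active THP mode), and get_meminfo itself is a field scanner over the snapshot just printed. A condensed, runnable re-reading of what the trace shows get_meminfo doing; this is a sketch, not the verbatim setup/common.sh:

# Sketch of the traced behaviour: load a meminfo file, strip any
# "Node N " prefix (per-node files carry one), then scan field by field.
get_meminfo_sketch() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local -a mem
    mapfile -t mem < "$mem_f"
    shopt -s extglob
    mem=("${mem[@]#Node +([0-9]) }")
    local line var val _
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        if [[ $var == "$get" ]]; then
            echo "${val:-0}"
            return 0
        fi
    done
    return 1
}
get_meminfo_sketch HugePages_Total   # -> 1024 on this box, per the snapshot above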
00:05:13.598 [xtrace condensed: setup/common.sh@31-32 walks the snapshot above field by field with "IFS=': '; read -r var val _", continuing past every key that is not AnonHugePages]
00:05:13.599 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:13.599 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:05:13.599 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:05:13.599 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # anon=0
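[editor's note] anon is recorded as 0; the next two lookups fetch HugePages_Surp and HugePages_Rsvd the same way. As a hypothetical aside on what the trio of counters means, using the illustrative get_meminfo_sketch helper from above with this run's snapshot values:

# Hypothetical arithmetic, not part of the log or of hugepages.sh:
total=$(get_meminfo_sketch HugePages_Total)   # 1024 in this run
free=$(get_meminfo_sketch HugePages_Free)     # 1024
surp=$(get_meminfo_sketch HugePages_Surp)     # 0: surplus pages allocated beyond the configured pool
resv=$(get_meminfo_sketch HugePages_Rsvd)     # 0: pages reserved by mappings but not yet faulted in
echo "persistent pool: $((total - surp)), claimable right now: $((free - resv))"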
00:05:13.599 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:13.599 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:13.599 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:05:13.599 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:13.599 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:13.599 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:13.599 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:13.599 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:13.599 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:13.599 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:13.599 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:13.599 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:13.600 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 40543148 kB' 'MemAvailable: 44270412 kB' 'Buffers: 8940 kB' 'Cached: 13232540 kB' 'SwapCached: 0 kB' 'Active: 10521180 kB' 'Inactive: 3688224 kB' 'Active(anon): 10104388 kB' 'Inactive(anon): 0 kB' 'Active(file): 416792 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 971108 kB' 'Mapped: 145848 kB' 'Shmem: 9136464 kB' 'KReclaimable: 238228 kB' 'Slab: 872484 kB' 'SReclaimable: 238228 kB' 'SUnreclaim: 634256 kB' 'KernelStack: 21744 kB' 'PageTables: 8080 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 11623920 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214352 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
00:05:13.600 [xtrace condensed: the same per-field scan repeats, continuing past every key that is not HugePages_Surp]
00:05:13.601 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:13.601 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:05:13.601 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:05:13.601 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@98 -- # surp=0
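[editor's note] HugePages_Rsvd is read the same way below. Once all three counters are in hand, verify_nr_hugepages compares per-node totals against the requested split, the "node0=1024 expecting 1024" pattern seen at the end of single_node_setup above. A sketch of that per-node comparison for this test's 512/512 split, reusing the illustrative helper from earlier (per-node counters live in /sys/devices/system/node/nodeN/meminfo); names and flow are assumptions, not hugepages.sh verbatim:

# Illustrative per-node check, reusing get_meminfo_sketch from above.
expected=(512 512)                      # even_2G_alloc's requested per-node split
for node in "${!expected[@]}"; do
    got=$(get_meminfo_sketch HugePages_Total "$node")
    echo "node$node=$got expecting ${expected[node]}"
    [[ $got == "${expected[node]}" ]] || exit 1
done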
00:05:13.601 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:13.601 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:13.601 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:05:13.601 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:13.601 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:13.601 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:13.601 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:13.601 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:13.601 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:13.601 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:13.602 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:13.602 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:13.602 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 40545800 kB' 'MemAvailable: 44273064 kB' 'Buffers: 8940 kB' 'Cached: 13232544 kB' 'SwapCached: 0 kB' 'Active: 10521620 kB' 'Inactive: 3688224 kB' 'Active(anon): 10104828 kB' 'Inactive(anon): 0 kB' 'Active(file): 416792 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 971536 kB' 'Mapped: 145848 kB' 'Shmem: 9136468 kB' 'KReclaimable: 238228 kB' 'Slab: 872484 kB' 'SReclaimable: 238228 kB' 'SUnreclaim: 634256 kB' 'KernelStack: 21760 kB' 'PageTables: 8132 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 11623772 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214368 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
00:05:13.602 [xtrace condensed: the per-field scan now repeats for HugePages_Rsvd; the trace that follows continues that scan]
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@33 -- # echo 0 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # resv=0 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:05:13.867 nr_hugepages=1024 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:13.867 resv_hugepages=0 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:13.867 surplus_hugepages=0 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:13.867 anon_hugepages=0 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages )) 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:13.867 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:13.868 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:13.868 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:13.868 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:13.868 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:13.868 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:13.868 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.868 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.868 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 40545080 kB' 'MemAvailable: 44272344 kB' 'Buffers: 8940 kB' 'Cached: 13232544 kB' 'SwapCached: 0 kB' 'Active: 10521912 kB' 'Inactive: 3688224 kB' 'Active(anon): 10105120 kB' 'Inactive(anon): 0 kB' 'Active(file): 416792 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 971916 kB' 'Mapped: 146352 kB' 'Shmem: 9136468 kB' 'KReclaimable: 238228 kB' 'Slab: 872484 kB' 'SReclaimable: 238228 kB' 'SUnreclaim: 634256 kB' 'KernelStack: 21792 kB' 'PageTables: 8244 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 11627720 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214368 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 
425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB' 00:05:13.868 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.868 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue
[trace condensed, 00:05:13.868-00:05:13.870: setup/common.sh@31-32 compared each subsequent key from MemFree through Unaccepted against HugePages_Total; every key mismatched and hit continue]
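The scan condensed above is the whole of get_meminfo: pick the meminfo file, strip any per-node prefix, then walk key/value pairs until the requested key matches and echo its value. A minimal sketch reconstructed from the traced setup/common.sh commands (for readers of this log; not the verbatim script):

    #!/usr/bin/env bash
    shopt -s extglob                     # the +([0-9]) pattern below needs extglob
    get_meminfo() {                      # usage: get_meminfo <Key> [<node>]
        local get=$1 node=$2 var val _ line
        local mem_f=/proc/meminfo mem
        # with a node argument, read that node's own sysfs meminfo; with no
        # argument the path becomes node/node/meminfo, which never exists,
        # so /proc/meminfo is used (exactly as the node= trace above shows)
        [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"
        # per-node files prefix every line with "Node <N> "; strip it
        mem=("${mem[@]#Node +([0-9]) }")
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done
        return 1
    }
    get_meminfo HugePages_Total          # prints 1024 in this run
    get_meminfo HugePages_Surp 0         # prints 0 for node 0

The value is echoed without its unit, which is why the callers traced below can feed it straight into shell arithmetic.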
00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@26 -- # local node 00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512 00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512 00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
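get_nodes walks /sys/devices/system/node/node+([0-9]) and records each node's 2048 kB page count; xtrace prints the already-expanded result, which is why the log shows a literal nodes_sys[...]=512 twice. The even_2G_alloc expectation is that the 1024 system-wide pages land 512 per node (1 GB each). A sketch of that check, assuming the standard sysfs per-node counter, since the expression behind the traced 512 is not visible in the log:

    # requires: shopt -s extglob (as in the sketch above)
    nr_hugepages=1024 no_nodes=0
    declare -a nodes_sys
    for node in /sys/devices/system/node/node+([0-9]); do
        # hypothetical source of the traced 512s: the per-node sysfs counter
        nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
        (( ++no_nodes ))
    done
    for n in "${!nodes_sys[@]}"; do
        (( nodes_sys[n] == nr_hugepages / no_nodes )) ||
            echo "node$n holds ${nodes_sys[n]} pages, expected $(( nr_hugepages / no_nodes ))"
    done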
00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 18480592 kB' 'MemUsed: 14104776 kB' 'SwapCached: 0 kB' 'Active: 7536080 kB' 'Inactive: 3531400 kB' 'Active(anon): 7258940 kB' 'Inactive(anon): 0 kB' 'Active(file): 277140 kB' 'Inactive(file): 3531400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10518332 kB' 'Mapped: 128064 kB' 'AnonPages: 552348 kB' 'Shmem: 6709792 kB' 'KernelStack: 12584 kB' 'PageTables: 5112 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 159408 kB' 'Slab: 473328 kB' 'SReclaimable: 159408 kB' 'SUnreclaim: 313920 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.870 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue
[trace condensed, 00:05:13.870-00:05:13.871: node0 keys MemFree through HugePages_Free compared against HugePages_Surp; every key mismatched and hit continue]
00:05:13.871 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.871 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:13.871 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
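Node 0 reports HugePages_Surp: 0, so the bookkeeping leaves its expected count at 512. Condensed, the loop traced at setup/hugepages.sh@114-116 folds the reserved count plus each node's surplus into the per-node expectations (a sketch reusing the get_meminfo helper above; nodes_test is seeded with the 512s from this trace, and resv is the 0 looked up earlier):

    resv=0                              # HugePages_Rsvd from the lookup above
    nodes_test=([0]=512 [1]=512)        # per-node expectation, from nodes_sys
    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += resv ))                # hugepages.sh@115
        surp=$(get_meminfo HugePages_Surp "$node")    # hugepages.sh@116
        (( nodes_test[node] += surp ))                # 0 on both nodes here
    done

With both surpluses at 0, nodes_test stays at 512 apiece, matching the even split the test asserts.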
00:05:13.871 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:13.871 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:13.871 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:13.871 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1 00:05:13.871 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:13.871 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:05:13.871 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:13.871 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:13.871 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:13.871 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:13.871 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:13.871 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:13.871 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:13.872 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.872 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698404 kB' 'MemFree: 22064436 kB' 'MemUsed: 5633968 kB' 'SwapCached: 0 kB' 'Active: 2985288 kB' 'Inactive: 156824 kB' 'Active(anon): 2845636 kB' 'Inactive(anon): 0 kB' 'Active(file): 139652 kB' 'Inactive(file): 156824 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2723224 kB' 'Mapped: 17784 kB' 'AnonPages: 418988 kB' 'Shmem: 2426748 kB' 'KernelStack: 9128 kB' 'PageTables: 2892 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 78820 kB' 'Slab: 399144 kB' 'SReclaimable: 78820 kB' 'SUnreclaim: 320324 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:13.872 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.872 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.872 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue
[trace condensed, 00:05:13.872-00:05:13.873: node1 keys MemFree through SUnreclaim compared against HugePages_Surp; every key mismatched and hit continue]
00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.873 14:18:09
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- 
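The trace above shows setup/common.sh's get_meminfo walking /sys/devices/system/node/node1/meminfo field by field with IFS=': ' until it reaches HugePages_Surp, then echoing the value (0 here). A minimal sketch of that lookup pattern follows; it is a hypothetical re-implementation for illustration, not the project's actual setup/common.sh helper, and it assumes the standard /proc/meminfo layout plus the "Node <N> " prefix used by the per-node files:

```bash
#!/usr/bin/env bash
# Minimal sketch of a get_meminfo-style lookup (hypothetical helper, not
# the project's setup/common.sh). Per-node files prefix every line with
# "Node <N> ", e.g. "Node 1 HugePages_Surp: 0", so strip that first.
get_meminfo_sketch() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local line var val _
    while read -r line; do
        line=${line#Node "$node" }              # no-op for /proc/meminfo
        IFS=': ' read -r var val _ <<< "$line"  # same IFS split as the traced loop
        if [[ $var == "$get" ]]; then
            echo "$val"                         # e.g. "0" for HugePages_Surp
            return 0
        fi
    done < "$mem_f"
    return 1
}

get_meminfo_sketch HugePages_Surp 1   # node-1 lookup, as in the trace above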
setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # echo 'node0=512 expecting 512' 00:05:13.873 node0=512 expecting 512 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # echo 'node1=512 expecting 512' 00:05:13.873 node1=512 expecting 512 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@129 -- # [[ 512 == \5\1\2 ]] 00:05:13.873 00:05:13.873 real 0m3.680s 00:05:13.873 user 0m1.385s 00:05:13.873 sys 0m2.360s 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:13.873 14:18:09 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:13.873 ************************************ 00:05:13.873 END TEST even_2G_alloc 00:05:13.873 ************************************ 00:05:13.873 14:18:09 setup.sh.hugepages -- setup/hugepages.sh@202 -- # run_test odd_alloc odd_alloc 00:05:13.873 14:18:09 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:13.873 14:18:09 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:13.873 14:18:09 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:13.873 ************************************ 00:05:13.873 START TEST odd_alloc 00:05:13.873 ************************************ 00:05:13.874 14:18:09 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1129 -- # odd_alloc 00:05:13.874 14:18:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@149 -- # get_test_nr_hugepages 2098176 00:05:13.874 14:18:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@48 -- # local size=2098176 00:05:13.874 14:18:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 )) 00:05:13.874 14:18:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:05:13.874 14:18:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1025 00:05:13.874 14:18:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 00:05:13.874 14:18:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@61 -- # user_nodes=() 00:05:13.874 14:18:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:05:13.874 14:18:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1025 00:05:13.874 14:18:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:05:13.874 14:18:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:05:13.874 14:18:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:05:13.874 14:18:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 )) 00:05:13.874 14:18:09 
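Between the END/START banners above, odd_alloc requests 2098176 kB of hugepages (HUGEMEM=2049 MB) and arrives at nr_hugepages=1025. The exact rounding rule inside get_test_nr_hugepages is not visible in this excerpt, so the ceiling division below is an assumption that happens to reproduce the traced numbers:

```bash
# Back-of-the-envelope for the odd_alloc sizing seen in the trace.
# Assumes 2048 kB hugepages and ceiling division (the exact rule in
# setup/hugepages.sh is not shown in this log).
HUGEMEM=2049                                   # MB, exported at hugepages.sh@150
size_kb=$(( HUGEMEM * 1024 ))                  # 2098176 kB, the get_test_nr_hugepages arg
default_hugepages=2048                         # kB per 2M hugepage
nr_hugepages=$(( (size_kb + default_hugepages - 1) / default_hugepages ))
echo "nr_hugepages=$nr_hugepages"              # 1025 -> a deliberately odd count
```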
setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 )) 00:05:13.874 14:18:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:13.874 14:18:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512 00:05:13.874 14:18:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # : 513 00:05:13.874 14:18:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 1 00:05:13.874 14:18:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:13.874 14:18:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=513 00:05:13.874 14:18:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # : 0 00:05:13.874 14:18:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:05:13.874 14:18:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:13.874 14:18:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@150 -- # HUGEMEM=2049 00:05:13.874 14:18:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@150 -- # setup output 00:05:13.874 14:18:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:13.874 14:18:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:17.173 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:17.173 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:17.173 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:17.173 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:17.173 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:17.173 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:17.173 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:17.173 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:17.173 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:17.173 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:17.173 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:17.173 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:17.173 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:17.438 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:17.438 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:17.438 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:17.438 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@151 -- # verify_nr_hugepages 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@88 -- # local node 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local sorted_t 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_s 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local surp 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local resv 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local anon 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:05:17.438 14:18:13 
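The hugepages.sh@80-@83 loop above hands 512 pages to node 1 and then 513 to node 0, taking floor(remaining / nodes_left) for each node while counting _no_nodes down (the bare ": 513" and ": 1" entries are xtrace's rendering of the intermediate arithmetic). A standalone reproduction of that split, with variable names copied from the trace:

```bash
# Sketch of the per-node split implied by hugepages.sh@80-@83: the last
# node takes floor(remaining / nodes_left) first, so an odd total (1025)
# ends up as node0=513, node1=512.
_nr_hugepages=1025
_no_nodes=2
nodes_test=()
while (( _no_nodes > 0 )); do
    nodes_test[_no_nodes - 1]=$(( _nr_hugepages / _no_nodes ))
    (( _nr_hugepages -= nodes_test[_no_nodes - 1] ))   # 513 left after node 1
    (( _no_nodes-- ))
done
echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"   # node0=513 node1=512
```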
setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 40564792 kB' 'MemAvailable: 44292072 kB' 'Buffers: 8940 kB' 'Cached: 13232704 kB' 'SwapCached: 0 kB' 'Active: 10519408 kB' 'Inactive: 3688224 kB' 'Active(anon): 10102616 kB' 'Inactive(anon): 0 kB' 'Active(file): 416792 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 968764 kB' 'Mapped: 145972 kB' 'Shmem: 9136628 kB' 'KReclaimable: 238260 kB' 'Slab: 872752 kB' 'SReclaimable: 238260 kB' 'SUnreclaim: 634492 kB' 'KernelStack: 21760 kB' 'PageTables: 8200 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480888 kB' 'Committed_AS: 11624860 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214336 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB' 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@31 -- # read -r var val _ 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@31 -- # read -r var val _ 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.438 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.439 14:18:13 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # anon=0 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.439 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 40568432 kB' 'MemAvailable: 44295656 kB' 'Buffers: 8940 kB' 'Cached: 13232724 kB' 'SwapCached: 0 kB' 'Active: 10518564 kB' 'Inactive: 3688224 kB' 'Active(anon): 10101772 kB' 'Inactive(anon): 0 kB' 'Active(file): 416792 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 968388 kB' 'Mapped: 145860 kB' 'Shmem: 9136648 kB' 'KReclaimable: 238148 kB' 'Slab: 872596 kB' 'SReclaimable: 238148 kB' 'SUnreclaim: 634448 kB' 'KernelStack: 21728 kB' 'PageTables: 8088 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480888 kB' 'Committed_AS: 11624876 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 
14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ 
KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.440 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.441 
14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@98 -- # surp=0 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- 
# [[ -n '' ]] 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 40571800 kB' 'MemAvailable: 44299024 kB' 'Buffers: 8940 kB' 'Cached: 13232744 kB' 'SwapCached: 0 kB' 'Active: 10518588 kB' 'Inactive: 3688224 kB' 'Active(anon): 10101796 kB' 'Inactive(anon): 0 kB' 'Active(file): 416792 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 968388 kB' 'Mapped: 145860 kB' 'Shmem: 9136668 kB' 'KReclaimable: 238148 kB' 'Slab: 872596 kB' 'SReclaimable: 238148 kB' 'SUnreclaim: 634448 kB' 'KernelStack: 21728 kB' 'PageTables: 8088 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480888 kB' 'Committed_AS: 11624896 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB' 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.441 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
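The xtrace above is setup/common.sh's get_meminfo helper walking every key in /proc/meminfo until it reaches the one requested (here HugePages_Surp, then HugePages_Rsvd); each `[[ key == \H\u\g\e\P\a\g\e\s\_... ]]` / `continue` pair is one rejected line, and a match ends with `echo <value>` / `return 0`, which hugepages.sh captures into surp= and resv=. A minimal self-contained sketch of that loop, reconstructed from the trace rather than copied verbatim from the SPDK tree (the for-loop over the mapfile'd array stands in for the script's redirected read loop), looks like this:

#!/usr/bin/env bash
shopt -s extglob   # required by the "Node +([0-9]) " strip below

# get_meminfo KEY [NODE] -- print the value recorded for KEY, read from
# /proc/meminfo or, when NODE names an existing NUMA node, from
# /sys/devices/system/node/nodeNODE/meminfo (whose lines carry a
# "Node N " prefix that is stripped before matching).
get_meminfo() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    local -a mem
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # "Node 0 MemFree: ..." -> "MemFree: ..."
    local line var val _
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"   # split "key: value unit"
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1                            # key not found
}

surp=$(get_meminfo HugePages_Surp)      # -> 0 in the run above
resv=$(get_meminfo HugePages_Rsvd)      # -> 0 in the run above

With an empty NODE the `node$node` path does not exist (the trace's `[[ -e /sys/devices/system/node/node/meminfo ]]` check), so the helper falls back to the system-wide /proc/meminfo, which is why the full MemTotal/MemFree dump appears in the printf lines above.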
00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.442 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.443 14:18:13 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # resv=0 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1025 00:05:17.443 nr_hugepages=1025 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:17.443 resv_hugepages=0 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:17.443 surplus_hugepages=0 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:17.443 anon_hugepages=0 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@106 -- # (( 1025 == nr_hugepages + surp + resv )) 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@108 -- # (( 1025 == nr_hugepages )) 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.443 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 40572688 kB' 'MemAvailable: 44299912 kB' 'Buffers: 8940 kB' 'Cached: 13232764 kB' 'SwapCached: 0 kB' 'Active: 10518608 kB' 'Inactive: 3688224 kB' 'Active(anon): 10101816 kB' 'Inactive(anon): 0 kB' 'Active(file): 416792 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 968388 kB' 'Mapped: 145860 kB' 'Shmem: 9136688 kB' 'KReclaimable: 238148 kB' 'Slab: 872596 kB' 'SReclaimable: 238148 kB' 'SUnreclaim: 634448 kB' 'KernelStack: 21728 kB' 'PageTables: 8088 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480888 kB' 'Committed_AS: 
11624916 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB' 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.444 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.709 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.709 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.709 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.709 
14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.709 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.709 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.709 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.709 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.709 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.709 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.709 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.709 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.709 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.709 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.709 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.709 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.709 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.709 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages 
== \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages + surp + resv )) 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:17.710 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@26 -- # local node 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=513 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:17.711 14:18:13 
setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 18502216 kB' 'MemUsed: 14083152 kB' 'SwapCached: 0 kB' 'Active: 7535776 kB' 'Inactive: 3531400 kB' 'Active(anon): 7258636 kB' 'Inactive(anon): 0 kB' 'Active(file): 277140 kB' 'Inactive(file): 3531400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10518424 kB' 'Mapped: 128076 kB' 'AnonPages: 551900 kB' 'Shmem: 6709884 kB' 'KernelStack: 12584 kB' 'PageTables: 5144 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 159296 kB' 'Slab: 473452 kB' 'SReclaimable: 159296 kB' 'SUnreclaim: 314156 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:17.711 14:18:13 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31-32 -- # [xtrace condensed: the IFS=': ' read loop scans the remaining node0 meminfo keys (Writeback ... HugePages_Free) until HugePages_Surp matches]
00:05:17.712 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:17.712 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:17.712 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:17.712 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:17.712 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:17.712 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1
00:05:17.712 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:17.712 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1
00:05:17.712 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:17.712 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:17.712 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:17.712 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:05:17.712 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:05:17.712 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:17.712 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:17.712 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698404 kB' 'MemFree: 22069464 kB' 'MemUsed: 5628940 kB' 'SwapCached: 0 kB' 'Active: 2982816 kB' 'Inactive: 156824 kB' 'Active(anon): 2843164 kB' 'Inactive(anon): 0 kB' 'Active(file): 139652 kB' 'Inactive(file): 156824 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2723300 kB' 'Mapped: 17784 kB' 'AnonPages: 416488 kB' 'Shmem: 2426824 kB' 'KernelStack: 9144 kB' 'PageTables: 2944 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 78852 kB' 'Slab: 399176 kB' 'SReclaimable: 78852 kB' 'SUnreclaim: 320324 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:05:17.712 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31-32 -- # [xtrace condensed: scans node1 meminfo keys (MemTotal ... HugePages_Free) until HugePages_Surp matches]
00:05:17.712 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:17.712 14:18:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:17.712 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:17.713 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:17.713 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:17.713 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:17.713 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # echo 'node0=513 expecting 513'
00:05:17.713 node0=513 expecting 513
00:05:17.713 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:17.713 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:17.714 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:17.714 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # echo 'node1=512 expecting 512'
00:05:17.714 node1=512 expecting 512
00:05:17.714 14:18:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@129 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]]
00:05:17.714 real 0m3.735s
00:05:17.714 user 0m1.371s
00:05:17.714 sys 0m2.431s
00:05:17.714 14:18:13 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable
00:05:17.714 14:18:13 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x
00:05:17.714 ************************************
00:05:17.714 END TEST odd_alloc
00:05:17.714 ************************************
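The get_meminfo calls traced above reduce to a small field scan: read /proc/meminfo (or a node's sysfs meminfo file when a node id is given), strip the per-node "Node N " prefix, and echo the value of the first key that matches. A minimal self-contained sketch of that technique follows; the function name is illustrative, not SPDK's exact helper.

    shopt -s extglob   # needed for the +([0-9]) prefix-strip pattern below

    # Sketch: look up one key in (per-node) meminfo, echoing its value or 0.
    get_meminfo_sketch() {
        local get=$1 node=$2 var val _
        local mem_f=/proc/meminfo mem
        # Per-node stats live in sysfs; fall back to the global file otherwise.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # Per-node lines carry a "Node N " prefix; strip it before matching.
        mem=("${mem[@]#Node +([0-9]) }")
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && { echo "${val:-0}"; return 0; }
        done < <(printf '%s\n' "${mem[@]}")
        echo 0
    }

    # e.g. get_meminfo_sketch HugePages_Surp 1   -> 0 on this box, per the dump above

The echo 0 / return 0 pairs in the trace are exactly this match firing on HugePages_Surp with value 0 for each node.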
00:05:17.714 14:18:13 setup.sh.hugepages -- setup/hugepages.sh@203 -- # run_test custom_alloc custom_alloc
00:05:17.714 14:18:13 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:05:17.714 14:18:13 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:05:17.714 14:18:13 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:17.714 ************************************
00:05:17.714 START TEST custom_alloc
00:05:17.714 ************************************
00:05:17.714 14:18:13 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1129 -- # custom_alloc
00:05:17.714 14:18:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@157 -- # local IFS=,
00:05:17.714 14:18:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@159 -- # local node
00:05:17.714 14:18:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@160 -- # nodes_hp=(); local nodes_hp
00:05:17.714 14:18:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@162 -- # local nr_hugepages=0 _nr_hugepages=0
00:05:17.714 14:18:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@164 -- # get_test_nr_hugepages 1048576
00:05:17.714 14:18:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@48 -- # local size=1048576
00:05:17.714 14:18:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:05:17.714 14:18:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:17.714 14:18:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=512
00:05:17.714 14:18:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57-83 -- # [xtrace condensed: get_test_nr_hugepages_per_node defaults the 512 pages across 2 nodes: nodes_test[0]=256, nodes_test[1]=256]
00:05:17.714 14:18:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@165 -- # nodes_hp[0]=512
00:05:17.714 14:18:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@166 -- # (( 2 > 1 ))
00:05:17.714 14:18:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # get_test_nr_hugepages 2097152
00:05:17.714 14:18:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@48 -- # local size=2097152
00:05:17.714 14:18:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:05:17.714 14:18:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:17.714 14:18:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:05:17.714 14:18:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57-77 -- # [xtrace condensed: per-node split now seeded from nodes_hp: nodes_test[0]=512; return 0]
00:05:17.714 14:18:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@168 -- # nodes_hp[1]=1024
00:05:17.714 14:18:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@171-173 -- # [xtrace condensed: for each node, HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") and (( _nr_hugepages += nodes_hp[node] ))]
00:05:17.714 14:18:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # get_test_nr_hugepages_per_node
00:05:17.714 14:18:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61-77 -- # [xtrace condensed: nodes_test[0]=512, nodes_test[1]=1024; return 0]
00:05:17.714 14:18:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'
00:05:17.714 14:18:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # setup output
00:05:17.714 14:18:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:17.714 14:18:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:21.015 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:21.015 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:21.015 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:21.016 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:21.016 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:21.016 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:21.016 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:21.016 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:21.016 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:21.016 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:21.016 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:21.016 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:21.016 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:21.016 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:21.016 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:21.016 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:21.016 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:21.281 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nr_hugepages=1536
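The arithmetic the custom_alloc trace just walked through: get_test_nr_hugepages converts a size in kB into a page count using the 2048 kB default hugepage size, the two results are pinned per NUMA node, and the IFS=, join produces the HUGENODE string handed to scripts/setup.sh. A hedged sketch that reproduces those numbers; the variable names mirror the trace, the standalone script itself is illustrative.

    # Sketch: reproduce the per-node hugepage budget shown in the trace.
    default_hugepages=2048                            # kB, from 'Hugepagesize: 2048 kB'
    declare -a nodes_hp
    nodes_hp[0]=$(( 1048576 / default_hugepages ))    # 1 GiB in kB -> 512 pages on node 0
    nodes_hp[1]=$(( 2097152 / default_hugepages ))    # 2 GiB in kB -> 1024 pages on node 1

    # Build "nodes_hp[N]=count" entries and total them, as @171-173 do.
    parts=() total=0
    for node in "${!nodes_hp[@]}"; do
        parts+=("nodes_hp[$node]=${nodes_hp[node]}")
        (( total += nodes_hp[node] ))
    done
    HUGENODE=$(IFS=,; echo "${parts[*]}")   # comma join, like the local IFS=, above

    echo "HUGENODE=$HUGENODE"    # nodes_hp[0]=512,nodes_hp[1]=1024
    echo "nr_hugepages=$total"   # 1536, matching hugepages.sh@178

The 1536 total is what verify_nr_hugepages below checks against HugePages_Total in /proc/meminfo.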
00:05:21.281 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # verify_nr_hugepages
00:05:21.281 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@88 -- # local node
00:05:21.281 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:21.281 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:05:21.281 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:21.281 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:21.281 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:21.281 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:21.281 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:21.281 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:21.281 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:05:21.281 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:21.281 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:21.281 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:21.281 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:21.281 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:21.281 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:21.281 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:21.281 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 39556140 kB' 'MemAvailable: 43283364 kB' 'Buffers: 8940 kB' 'Cached: 13232868 kB' 'SwapCached: 0 kB' 'Active: 10520992 kB' 'Inactive: 3688224 kB' 'Active(anon): 10104200 kB' 'Inactive(anon): 0 kB' 'Active(file): 416792 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 970624 kB' 'Mapped: 145936 kB' 'Shmem: 9136792 kB' 'KReclaimable: 238148 kB' 'Slab: 872856 kB' 'SReclaimable: 238148 kB' 'SUnreclaim: 634708 kB' 'KernelStack: 21904 kB' 'PageTables: 8276 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957624 kB' 'Committed_AS: 11627104 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214480 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
00:05:21.281 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # [xtrace condensed: scans meminfo keys (MemTotal ... HardwareCorrupted) until AnonHugePages matches at 0 kB]
00:05:21.283 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:21.283 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:21.283 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # anon=0
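The anon=0 above comes from a guarded sample: verify_nr_hugepages only reads AnonHugePages when transparent hugepages are not pinned to "never" (the bracketed token in /sys/kernel/mm/transparent_hugepage/enabled is the active policy), since THP-backed anonymous memory would otherwise skew the huge-page accounting. A standalone sketch of that guard, using awk in place of the meminfo helper sketched earlier:

    # Sketch: sample AnonHugePages (kB) only if THP is not set to [never].
    thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled 2>/dev/null)
    anon=0
    if [[ $thp != *"[never]"* ]]; then
        # e.g. "always [madvise] never" -> sampling proceeds, as in the trace
        anon=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)
    fi
    echo "anon=${anon:-0} kB"

On this runner the file reads "always [madvise] never", so the sample runs and returns 0 kB.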
00:05:21.283 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:21.283 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:21.283 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:05:21.283 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:21.283 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:21.283 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:21.283 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:21.283 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:21.283 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:21.283 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:21.283 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 39554504 kB' 'MemAvailable: 43281728 kB' 'Buffers: 8940 kB' 'Cached: 13232872 kB' 'SwapCached: 0 kB' 'Active: 10524416 kB' 'Inactive: 3688224 kB' 'Active(anon): 10107624 kB' 'Inactive(anon): 0 kB' 'Active(file): 416792 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 974092 kB' 'Mapped: 146388 kB' 'Shmem: 9136796 kB' 'KReclaimable: 238148 kB' 'Slab: 872924 kB' 'SReclaimable: 238148 kB' 'SUnreclaim: 634776 kB' 'KernelStack: 21792 kB' 'PageTables: 8284 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957624 kB' 'Committed_AS: 11632444 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214400 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
00:05:21.283 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # [xtrace condensed: scanning meminfo keys (MemTotal ... Committed_AS) for HugePages_Surp]
00:05:21.284 14:18:17
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.284 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.285 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.285 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.285 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.285 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.285 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.285 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.285 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.285 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.285 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.285 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:21.285 14:18:17 
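The loop condensed above is setup/common.sh's get_meminfo walking /proc/meminfo one 'Key: value' pair at a time; the backslash-escaped right-hand side in each [[ ]] test is simply how bash xtrace prints a quoted pattern so the comparison stays literal. A minimal standalone sketch of the same parsing idea, simplified from the mapfile-based loop the script actually runs (the helper name is illustrative, not SPDK's code):

# Split each "Key:   value kB" line of /proc/meminfo on ':' and blanks,
# echo the value once the requested key matches.
get_meminfo_sketch() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < /proc/meminfo
    return 1    # key not present
}

get_meminfo_sketch HugePages_Surp    # prints 0 on the machine traced here

With IFS=': ', read splits 'HugePages_Surp:    0' into var=HugePages_Surp and val=0, which is why the trace shows one IFS/read/[[ ]] triple per meminfo line.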
00:05:21.285 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:21.285 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:21.285 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:05:21.285 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:21.285 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:21.285 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:21.285 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:21.285 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:21.285 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:21.285 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:21.285 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:21.285 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:21.285 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 39556768 kB' 'MemAvailable: 43283992 kB' 'Buffers: 8940 kB' 'Cached: 13232888 kB' 'SwapCached: 0 kB' 'Active: 10521264 kB' 'Inactive: 3688224 kB' 'Active(anon): 10104472 kB' 'Inactive(anon): 0 kB' 'Active(file): 416792 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 970892 kB' 'Mapped: 146724 kB' 'Shmem: 9136812 kB' 'KReclaimable: 238148 kB' 'Slab: 872920 kB' 'SReclaimable: 238148 kB' 'SUnreclaim: 634772 kB' 'KernelStack: 21808 kB' 'PageTables: 8652 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957624 kB' 'Committed_AS: 11629296 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214416 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
00:05:21.285 - 00:05:21.287 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # [repetitive xtrace condensed: the same read loop walks every /proc/meminfo key from MemTotal through HugePages_Free until the requested key HugePages_Rsvd matches]
00:05:21.287 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:21.287 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:21.287 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:21.287 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # resv=0
00:05:21.287 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1536
00:05:21.287 nr_hugepages=1536
00:05:21.287 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:05:21.287 resv_hugepages=0
00:05:21.287 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:05:21.287 surplus_hugepages=0
00:05:21.287 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:05:21.287 anon_hugepages=0
00:05:21.287 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@106 -- # (( 1536 == nr_hugepages + surp + resv ))
00:05:21.287 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@108 -- # (( 1536 == nr_hugepages ))
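At this point the script has surp=0, resv=0 and nr_hugepages=1536, and the (( ... )) lines assert that the kernel's view agrees with the requested allocation. The same bookkeeping in isolation, reusing the illustrative helper sketched earlier (values as observed in this run, where the test spreads 1536 pages across the two NUMA nodes):

# Consistency check corresponding to hugepages.sh@106/108.
nr_hugepages=1536                              # requested by the test
surp=$(get_meminfo_sketch HugePages_Surp)      # 0 here
resv=$(get_meminfo_sketch HugePages_Rsvd)      # 0 here
total=$(get_meminfo_sketch HugePages_Total)    # 1536 here
(( total == nr_hugepages + surp + resv )) || echo 'hugepage accounting mismatch' >&2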
00:05:21.287 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:05:21.287 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:21.287 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:05:21.287 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:21.287 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:21.287 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:21.287 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:21.287 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:21.287 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:21.287 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:21.287 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:21.287 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 39547060 kB' 'MemAvailable: 43274284 kB' 'Buffers: 8940 kB' 'Cached: 13232912 kB' 'SwapCached: 0 kB' 'Active: 10526344 kB' 'Inactive: 3688224 kB' 'Active(anon): 10109552 kB' 'Inactive(anon): 0 kB' 'Active(file): 416792 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 975924 kB' 'Mapped: 146388 kB' 'Shmem: 9136836 kB' 'KReclaimable: 238148 kB' 'Slab: 872912 kB' 'SReclaimable: 238148 kB' 'SUnreclaim: 634764 kB' 'KernelStack: 21744 kB' 'PageTables: 8092 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957624 kB' 'Committed_AS: 11634348 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214388 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
00:05:21.287 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:21.287 - 00:05:21.289 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # [repetitive xtrace condensed: the same read loop walks every /proc/meminfo key from MemTotal through Unaccepted until the requested key HugePages_Total matches]
00:05:21.289 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:21.289 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536
00:05:21.289 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:21.289 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages + surp + resv ))
00:05:21.289 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:05:21.289 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@26 -- # local node
00:05:21.289 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:21.289 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512
00:05:21.289 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:21.289 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024
00:05:21.289 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
00:05:21.289 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:05:21.289 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:21.289 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:21.289 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:05:21.289 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:21.289 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0
00:05:21.289 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:21.289 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:21.289 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:21.289 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 18532284 kB' 'MemUsed: 14053084 kB' 'SwapCached: 0 kB' 'Active: 7538268 kB' 'Inactive: 3531400 kB' 'Active(anon): 7261128 kB' 'Inactive(anon): 0 kB' 'Active(file): 277140 kB' 'Inactive(file): 3531400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10518524 kB' 'Mapped: 128336 kB' 'AnonPages: 554304 kB' 'Shmem: 6709984 kB' 'KernelStack: 12600 kB' 'PageTables: 5144 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 159296 kB' 'Slab: 473700 kB' 'SReclaimable: 159296 kB' 'SUnreclaim: 314404 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # [repetitive xtrace condensed: the read loop now walks node0's meminfo keys looking for HugePages_Surp; the trace continues at 00:05:21.552]
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- 
# continue 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.552 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.553 14:18:17 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698404 kB' 'MemFree: 21020172 kB' 'MemUsed: 6678232 kB' 'SwapCached: 0 kB' 'Active: 2981816 kB' 'Inactive: 156824 kB' 'Active(anon): 2842164 kB' 'Inactive(anon): 0 kB' 'Active(file): 139652 kB' 'Inactive(file): 
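Every get_meminfo call in this trace expands into the same setup-and-scan sequence, so for reference here is the helper reassembled from the xtrace into a standalone sketch. This is a simplified reconstruction, not the verbatim setup/common.sh seen in the trace; the names follow the traced commands.

  #!/usr/bin/env bash
  # Sketch of get_meminfo as reconstructed from the xtrace above.
  # extglob is needed for the "Node N " prefix strip applied to the
  # per-node meminfo files under /sys/devices/system/node/.
  shopt -s extglob

  get_meminfo() {
      local get=$1 node=$2 var val _
      local mem_f=/proc/meminfo mem
      # Prefer the per-node view when a node id is given and it exists
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      mapfile -t mem <"$mem_f"
      mem=("${mem[@]#Node +([0-9]) }")   # drop the "Node 0 " prefix, if any
      while IFS=': ' read -r var val _; do
          # First matching field wins; the value is echoed without the kB unit
          [[ $var == "$get" ]] && { echo "$val"; return 0; }
      done < <(printf '%s\n' "${mem[@]}")
      return 1
  }

This is exactly what produces the echo 1536 / echo 0 lines above: the loop discards every field until the requested key matches, then returns its numeric value.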
00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1
00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1
00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:21.553 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698404 kB' 'MemFree: 21020172 kB' 'MemUsed: 6678232 kB' 'SwapCached: 0 kB' 'Active: 2981816 kB' 'Inactive: 156824 kB' 'Active(anon): 2842164 kB' 'Inactive(anon): 0 kB' 'Active(file): 139652 kB' 'Inactive(file): 156824 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2723364 kB' 'Mapped: 17784 kB' 'AnonPages: 415380 kB' 'Shmem: 2426888 kB' 'KernelStack: 9096 kB' 'PageTables: 2936 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 78852 kB' 'Slab: 399300 kB' 'SReclaimable: 78852 kB' 'SUnreclaim: 320448 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[... xtrace elided: the field scan repeats over the node1 meminfo entries until it reaches HugePages_Surp ...]
00:05:21.554 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:21.554 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:21.554 14:18:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:21.554 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:21.555 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:21.555 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:21.555 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:21.555 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # echo 'node0=512 expecting 512'
00:05:21.555 node0=512 expecting 512
00:05:21.555 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:21.555 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:21.555 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:21.555 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # echo 'node1=1024 expecting 1024'
00:05:21.555 node1=1024 expecting 1024
00:05:21.555 14:18:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@129 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:05:21.555
00:05:21.555 real 0m3.767s
00:05:21.555 user 0m1.434s
00:05:21.555 sys 0m2.404s
00:05:21.555 14:18:17 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable
00:05:21.555 14:18:17 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x
00:05:21.555 ************************************
00:05:21.555 END TEST custom_alloc
00:05:21.555 ************************************
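To summarize the custom_alloc pass that just finished: the test requested 1536 hugepages split 512/1024 across the two NUMA nodes, then re-derived that split from the kernel's own counters. Restated as a standalone sketch (a simplified restatement of the traced check, using the get_meminfo sketch above; HugePages_Surp and resv are both 0 in this run, so the arithmetic is easy to eyeball):

  # Requested split vs. what the kernel reports; mirrors hugepages.sh@109-129
  declare -A nodes_test=([0]=512 [1]=1024) nodes_sys=([0]=512 [1]=1024)
  # Global total must account for every requested, surplus and reserved page
  (( $(get_meminfo HugePages_Total) == 1536 )) || echo 'unexpected total'
  for node in "${!nodes_test[@]}"; do
      # Surplus pages on a node inflate what the test is allowed to see
      (( nodes_test[node] += $(get_meminfo HugePages_Surp "$node") ))
      echo "node$node=${nodes_test[node]} expecting ${nodes_sys[node]}"
  done

With zero surplus on both nodes this reproduces the two "expecting" lines in the log, and the final 512,1024 comparison passes.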
00:05:21.555 14:18:17 setup.sh.hugepages -- setup/hugepages.sh@204 -- # run_test no_shrink_alloc no_shrink_alloc
00:05:21.555 14:18:17 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:05:21.555 14:18:17 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:05:21.555 14:18:17 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:21.555 ************************************
00:05:21.555 START TEST no_shrink_alloc
00:05:21.555 ************************************
00:05:21.555 14:18:17 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1129 -- # no_shrink_alloc
00:05:21.555 14:18:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@185 -- # get_test_nr_hugepages 2097152 0
00:05:21.555 14:18:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@48 -- # local size=2097152
00:05:21.555 14:18:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # (( 2 > 1 ))
00:05:21.555 14:18:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # shift
00:05:21.555 14:18:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # node_ids=('0')
00:05:21.555 14:18:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # local node_ids
00:05:21.555 14:18:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:21.555 14:18:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:05:21.555 14:18:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 0
00:05:21.555 14:18:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@61 -- # user_nodes=('0')
00:05:21.555 14:18:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:21.555 14:18:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
00:05:21.555 14:18:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:21.555 14:18:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:21.555 14:18:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:21.555 14:18:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@68 -- # (( 1 > 0 ))
00:05:21.555 14:18:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # for _no_nodes in "${user_nodes[@]}"
00:05:21.555 14:18:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # nodes_test[_no_nodes]=1024
00:05:21.555 14:18:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@72 -- # return 0
00:05:21.555 14:18:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # NRHUGE=1024
00:05:21.555 14:18:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # HUGENODE=0
00:05:21.555 14:18:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # setup output
00:05:21.555 14:18:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:21.555 14:18:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:24.856 0000:00:04.7 through 0000:00:04.0 and 0000:80:04.7 through 0000:80:04.0 (8086 2021): Already using the vfio-pci driver [16 near-identical lines condensed]
00:05:24.856 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
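The NRHUGE=1024 HUGENODE=0 invocation above asks setup.sh to place the whole reservation on node 0. At the kernel interface level a node-targeted 2 MiB reservation comes down to a per-node sysfs write; the following is a minimal sketch assuming the standard kernel sysfs layout, not setup.sh's actual code path, which adds validation and device binding on top:

  # Reserve 1024 x 2 MiB hugepages on a single NUMA node (sketch)
  NRHUGE=1024 HUGENODE=0
  echo "$NRHUGE" | sudo tee \
      "/sys/devices/system/node/node${HUGENODE}/hugepages/hugepages-2048kB/nr_hugepages"

The verify step that follows then expects node0's HugePages_Total to read 1024 while node1 contributes nothing.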
00:05:25.123 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@189 -- # verify_nr_hugepages
00:05:25.123 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@88 -- # local node
00:05:25.123 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:25.123 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:05:25.123 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:25.123 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:25.123 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:25.123 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:25.123 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:25.123 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:25.123 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:25.123 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:25.123 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:25.123 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:25.123 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:25.123 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:25.123 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:25.123 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:25.123 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:25.123 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:25.123 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 40596208 kB' 'MemAvailable: 44323432 kB' 'Buffers: 8940 kB' 'Cached: 13233044 kB' 'SwapCached: 0 kB' 'Active: 10521956 kB' 'Inactive: 3688224 kB' 'Active(anon): 10105164 kB' 'Inactive(anon): 0 kB' 'Active(file): 416792 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 971576 kB' 'Mapped: 145944 kB' 'Shmem: 9136968 kB' 'KReclaimable: 238148 kB' 'Slab: 872980 kB' 'SReclaimable: 238148 kB' 'SUnreclaim: 634832 kB' 'KernelStack: 21808 kB' 'PageTables: 8364 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 11633000 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214384 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
[... xtrace elided: the field scan repeats over each /proc/meminfo entry until it reaches AnonHugePages ...]
00:05:25.124 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:25.124 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:25.124 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:25.124 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # anon=0
get=HugePages_Surp 00:05:25.124 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:25.124 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:25.124 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:25.124 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:25.124 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:25.124 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:25.124 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:25.124 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:25.124 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.124 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.125 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 40596436 kB' 'MemAvailable: 44323660 kB' 'Buffers: 8940 kB' 'Cached: 13233060 kB' 'SwapCached: 0 kB' 'Active: 10520128 kB' 'Inactive: 3688224 kB' 'Active(anon): 10103336 kB' 'Inactive(anon): 0 kB' 'Active(file): 416792 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 969644 kB' 'Mapped: 145888 kB' 'Shmem: 9136984 kB' 'KReclaimable: 238148 kB' 'Slab: 873012 kB' 'SReclaimable: 238148 kB' 'SUnreclaim: 634864 kB' 'KernelStack: 21680 kB' 'PageTables: 7936 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 11626244 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB' 00:05:25.125 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.125 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:25.125 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.125 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.125 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.125 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:25.125 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.125 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.125 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:25.125 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:25.125 14:18:21 setup.sh.hugepages.no_shrink_alloc -- 
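An aside for readers of this trace: the loop being traced is a /proc/meminfo lookup, i.e. split each line on ': ', compare the key against the requested field, and print the value on a match. A minimal self-contained sketch of that pattern follows; get_meminfo_value is an illustrative name for this sketch, not the SPDK helper itself.

#!/usr/bin/env bash
# Minimal sketch of the lookup pattern the trace above exercises: split each
# /proc/meminfo line on ': ', compare the key with the requested field, and
# print the value on a match. get_meminfo_value is illustrative, not SPDK code.
get_meminfo_value() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        # IFS=': ' strips the trailing colon from the key and the padding
        # spaces before the value; any unit ("kB") lands in _.
        if [[ $var == "$get" ]]; then
            echo "$val"    # e.g. AnonHugePages -> 0, HugePages_Total -> 1024
            return 0
        fi
    done < /proc/meminfo
    return 1               # field not present on this kernel
}

get_meminfo_value HugePages_Surp   # prints 0 for the snapshot logged above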
00:05:25.125 14:18:21 setup.sh.hugepages.no_shrink_alloc -- [trace condensed: setup/common.sh@31-32 compared fields MemTotal .. HugePages_Rsvd against HugePages_Surp; every field hit "continue", no match]
00:05:25.126 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:25.126 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:25.126 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:25.126 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # surp=0
00:05:25.126 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:25.126 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:25.126 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:25.126 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:25.126 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:25.126 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:25.127 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:25.127 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:25.127 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:25.127 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:25.127 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:25.127 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:25.127 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 40596436 kB' 'MemAvailable: 44323660 kB' 'Buffers: 8940 kB' 'Cached: 13233064 kB' 'SwapCached: 0 kB' 'Active: 10520752 kB' 'Inactive: 3688224 kB' 'Active(anon): 10103960 kB' 'Inactive(anon): 0 kB' 'Active(file): 416792 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 970316 kB' 'Mapped: 145888 kB' 'Shmem: 9136988 kB' 'KReclaimable: 238148 kB' 'Slab: 873012 kB' 'SReclaimable: 238148 kB' 'SUnreclaim: 634864 kB' 'KernelStack: 21744 kB' 'PageTables: 8144 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 11626264 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214288 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
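A second aside: the "local node=" and "[[ -e /sys/devices/system/node/node/meminfo ]]" entries above show the helper probing for a per-NUMA-node meminfo file before falling back to the system-wide one; with node unset the probed path is literally .../node/meminfo, which does not exist. A hedged sketch of that source selection, with illustrative names:

#!/usr/bin/env bash
# Sketch of the meminfo source selection implied by the @18-@25 trace entries:
# probe for a per-NUMA-node meminfo file, fall back to /proc/meminfo.
# pick_meminfo is an illustrative name, not the SPDK helper.
pick_meminfo() {
    local node=${1:-}            # e.g. "0" for NUMA node 0; empty means system-wide
    local mem_f=/proc/meminfo
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    echo "$mem_f"
}

# Per-node files prefix every line with "Node <n> ", which the traced
# mem=("${mem[@]#Node +([0-9]) }") strips via an extglob pattern; a portable
# equivalent that works for both sources:
sed 's/^Node [0-9]* //' "$(pick_meminfo 0)" | grep '^HugePages_'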
00:05:25.127 14:18:21 setup.sh.hugepages.no_shrink_alloc -- [trace condensed: setup/common.sh@31-32 compared fields MemTotal .. HugePages_Free against HugePages_Rsvd; every field hit "continue", no match]
00:05:25.129 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:25.129 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:25.129 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:25.129 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # resv=0
00:05:25.129 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024
00:05:25.129 nr_hugepages=1024
00:05:25.129 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:05:25.129 resv_hugepages=0
00:05:25.129 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:05:25.129 surplus_hugepages=0
00:05:25.129 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:05:25.129 anon_hugepages=0
00:05:25.129 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:25.129 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages ))
00:05:25.129 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:05:25.129 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:25.129 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
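One more aside: the hugepages.sh entries just above record the values this test case validates, a pool of nr_hugepages=1024 with zero reserved, surplus, and anonymous hugepages, followed by two arithmetic checks. A minimal sketch of that kind of consistency check, using the numbers reported in this log (verify_hugepage_pool is an illustrative name):

#!/usr/bin/env bash
# Sketch of the pool sanity check suggested by the hugepages.sh@106-@108
# entries: the kernel-visible total should equal the requested pages plus any
# surplus and reserved pages. verify_hugepage_pool is illustrative, not SPDK code.
verify_hugepage_pool() {
    local nr_hugepages=$1 surp=$2 resv=$3 total=$4
    (( total == nr_hugepages + surp + resv )) || return 1
    (( total == nr_hugepages )) || return 1    # this run expects no surplus/reserved
}

# Values from this log: 1024 pages of 2048 kB each (Hugetlb: 2097152 kB).
verify_hugepage_pool 1024 0 0 1024 && echo 'hugepage pool consistent'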
00:05:25.129 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:25.129 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:25.129 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:25.129 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:25.129 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:25.129 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:25.129 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:25.129 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 40594924 kB' 'MemAvailable: 44322148 kB' 'Buffers: 8940 kB' 'Cached: 13233088 kB' 'SwapCached: 0 kB' 'Active: 10521884 kB' 'Inactive: 3688224 kB' 'Active(anon): 10105092 kB' 'Inactive(anon): 0 kB' 'Active(file): 416792 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 971476 kB' 'Mapped: 145888 kB' 'Shmem: 9137012 kB' 'KReclaimable: 238148 kB' 'Slab: 873012 kB' 'SReclaimable: 238148 kB' 'SUnreclaim: 634864 kB' 'KernelStack: 21760 kB' 'PageTables: 8244 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 11641368 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214272 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
00:05:25.129 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:25.129 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:25.129 14:18:21 setup.sh.hugepages.no_shrink_alloc -- [trace condensed: setup/common.sh@31-32 compared fields MemTotal .. KernelStack against HugePages_Total; every field hit "continue", no match so far]
00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables ==
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:25.130 14:18:21 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.130 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:25.393 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:25.393 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:25.393 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:25.393 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:25.393 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:25.394 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:25.394 14:18:21 
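The get_meminfo trace above reduces to a small, reusable pattern: load the chosen meminfo file into an array with mapfile, strip the "Node N " prefix that per-node files carry, then split each line on ': ' and print the value of the first key that matches. A minimal standalone sketch of that pattern (bash 4+; get_meminfo_value and the usage lines are illustrative names, not the SPDK helper itself):

#!/usr/bin/env bash
shopt -s extglob   # needed for the +([0-9]) pattern below

get_meminfo_value() {
    local get=$1 node=${2:-} mem_f=/proc/meminfo
    local -a mem
    local line var val _
    # Prefer the per-node view when a node id is given and sysfs exposes it
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    mapfile -t mem <"$mem_f"
    # Per-node meminfo lines are prefixed with "Node N "; strip that off
    mem=("${mem[@]#Node +([0-9]) }")
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<<"$line"
        # First matching key wins; print its value and stop scanning
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1
}

get_meminfo_value HugePages_Total     # -> 1024 on this box
get_meminfo_value HugePages_Surp 0    # node0 surplus -> 0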
00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node
00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024
00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0
00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:25.131 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:25.393 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:05:25.393 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:25.393 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0
00:05:25.393 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:25.394 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:25.394 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:25.394 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:25.394 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:25.394 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:25.394 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:25.394 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:25.394 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:25.394 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 17488496 kB' 'MemUsed: 15096872 kB' 'SwapCached: 0 kB' 'Active: 7539772 kB' 'Inactive: 3531400 kB' 'Active(anon): 7262632 kB' 'Inactive(anon): 0 kB' 'Active(file): 277140 kB' 'Inactive(file): 3531400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10518572 kB' 'Mapped: 128104 kB' 'AnonPages: 555784 kB' 'Shmem: 6710032 kB' 'KernelStack: 12616 kB' 'PageTables: 5196 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 159296 kB' 'Slab: 473484 kB' 'SReclaimable: 159296 kB' 'SUnreclaim: 314188 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[the same @31/@32 read-and-compare loop then walks the node0 keys, MemTotal through HugePages_Free, against HugePages_Surp; each non-matching key hits 'continue']
00:05:25.395 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:25.395 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:25.395 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:25.395 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:25.395 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:25.395 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:25.395 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:25.395 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024'
node0=1024 expecting 1024
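get_nodes, traced above, discovers the NUMA topology by globbing the sysfs node directories and keying an array on the trailing node id. A sketch of the same enumeration, assuming the standard sysfs layout for 2 MiB hugepages (variable names are illustrative):

#!/usr/bin/env bash
shopt -s extglob nullglob   # +([0-9]) glob; skip the loop cleanly if nothing matches

declare -a nodes_sys
for node in /sys/devices/system/node/node+([0-9]); do
    # ${node##*node} strips everything through the last "node", leaving just the id
    nodes_sys[${node##*node}]=$(<"$node/hugepages/hugepages-2048kB/nr_hugepages")
done

echo "no_nodes=${#nodes_sys[@]}"   # 2 on this machine
for id in "${!nodes_sys[@]}"; do
    echo "node$id: ${nodes_sys[id]} x 2 MiB hugepages"   # node0: 1024, node1: 0
done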
00:05:25.395 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]]
00:05:25.395 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # CLEAR_HUGE=no
00:05:25.395 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # NRHUGE=512
00:05:25.395 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # HUGENODE=0
00:05:25.395 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # setup output
00:05:25.395 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:25.395 14:18:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:28.701 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:28.701 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:28.701 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:28.701 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:28.701 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:28.701 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:28.701 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:28.701 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:28.701 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:28.701 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:28.701 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:28.701 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:28.701 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:28.701 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:28.701 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:28.701 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:28.701 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:28.701 INFO: Requested 512 hugepages but 1024 already allocated on node0
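The three @192 assignments exercise the hugepage knobs that scripts/setup.sh reads from the environment: NRHUGE (how many pages to reserve), HUGENODE (which NUMA node), and CLEAR_HUGE=no (keep reservations that already exist). An equivalent manual invocation on this workspace, run as root, would be:

# Ask for 512 x 2 MiB hugepages on NUMA node 0, keeping existing pages
sudo CLEAR_HUGE=no NRHUGE=512 HUGENODE=0 \
    /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
# With 1024 pages already reserved on node0, the script only reports:
#   INFO: Requested 512 hugepages but 1024 already allocated on node0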
00:05:28.701 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@194 -- # verify_nr_hugepages
00:05:28.701 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@88 -- # local node
00:05:28.701 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:28.701 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:05:28.701 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:28.701 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:28.701 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:28.701 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:28.701 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:28.701 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:28.701 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:28.701 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:28.701 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:28.701 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:28.701 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:28.701 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:28.701 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:28.701 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:28.701 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:28.701 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:28.701 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 40582232 kB' 'MemAvailable: 44309456 kB' 'Buffers: 8940 kB' 'Cached: 13233196 kB' 'SwapCached: 0 kB' 'Active: 10521696 kB' 'Inactive: 3688224 kB' 'Active(anon): 10104904 kB' 'Inactive(anon): 0 kB' 'Active(file): 416792 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 971016 kB' 'Mapped: 145904 kB' 'Shmem: 9137120 kB' 'KReclaimable: 238148 kB' 'Slab: 872992 kB' 'SReclaimable: 238148 kB' 'SUnreclaim: 634844 kB' 'KernelStack: 21696 kB' 'PageTables: 7996 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 11627916 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
[the @31/@32 read-and-compare loop then walks every key above, MemTotal through HardwareCorrupted, against AnonHugePages; each non-matching key hits 'continue']
00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # anon=0
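The @95 test above works because /sys/kernel/mm/transparent_hugepage/enabled lists every THP mode with the active one in brackets ('always [madvise] never' here), so matching against *[never]* tells the script whether THP is globally off before it samples AnonHugePages. A sketch of that probe (thp_state and anon are illustrative names):

#!/usr/bin/env bash
thp_state=$(</sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
anon=0
if [[ $thp_state != *'[never]'* ]]; then
    # THP is not globally disabled, so anonymous hugepages may be in use
    IFS=': ' read -r _ anon _ < <(grep AnonHugePages /proc/meminfo)
fi
echo "AnonHugePages: $anon kB"   # 0 kB in this run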
'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB' 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.703 14:18:24 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.703 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 
14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.704 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:28.705 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.705 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.705 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.705 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.705 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.705 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.705 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.705 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.705 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.705 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.705 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.970 14:18:24 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # surp=0 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:28.970 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 40586932 kB' 'MemAvailable: 44314156 kB' 'Buffers: 8940 kB' 'Cached: 13233240 kB' 'SwapCached: 0 kB' 'Active: 10522868 kB' 'Inactive: 3688224 kB' 'Active(anon): 10106076 kB' 'Inactive(anon): 0 kB' 'Active(file): 416792 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 972220 kB' 'Mapped: 145904 kB' 'Shmem: 9137164 kB' 'KReclaimable: 238148 kB' 'Slab: 872972 kB' 'SReclaimable: 238148 kB' 'SUnreclaim: 634824 kB' 'KernelStack: 21808 kB' 'PageTables: 8096 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 11629948 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214384 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB' 00:05:28.970 
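[editor's note: the pattern this trace keeps exercising is a plain /proc/meminfo lookup. Below is a minimal bash sketch of that pattern, reconstructed from the xtrace; the function and variable names (get_meminfo, get, node, mem_f, mem) follow the trace, but this is a sketch under those assumptions, not the verbatim setup/common.sh.]

    #!/usr/bin/env bash
    shopt -s extglob   # needed for the +([0-9]) pattern below

    # Look up one key in /proc/meminfo, or in a node's meminfo file
    # when a node number is given. Prints the bare value and returns 0
    # on a match, returns 1 if the key is absent.
    get_meminfo() {
        local get=$1 node=${2:-}
        local var val _
        local mem_f=/proc/meminfo
        local -a mem
        # Per-node lookups use the node-specific file when it exists;
        # with node empty, .../node/node/meminfo does not exist and we
        # fall back to /proc/meminfo, exactly as in the trace above.
        [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"
        # Per-node files prefix every line with "Node <n> "; strip it.
        mem=("${mem[@]#Node +([0-9]) }")
        # Scan "key: value [unit]" pairs; non-matching keys are skipped,
        # which is what all the `continue` entries in the trace are.
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }

[editor's note: because IFS=': ' splits on both the colon and the space, the unit lands in `_` and val carries only the number, e.g. `get_meminfo MemTotal` would print 60283772, and `get_meminfo HugePages_Surp` prints the 0 echoed above.]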
[xtrace condensed: every key from MemTotal through HugePages_Free is read and skipped via `continue`]
00:05:28.972 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:28.972 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:28.972 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:28.972 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # resv=0
00:05:28.972 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024
00:05:28.972 nr_hugepages=1024
00:05:28.972 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:05:28.972 resv_hugepages=0
00:05:28.972 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:05:28.972 surplus_hugepages=0
00:05:28.972 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:05:28.972 anon_hugepages=0
00:05:28.972 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:28.972 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages ))
00:05:28.972 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
[xtrace condensed: same common.sh@17-@29 lookup setup as above, now with get=HugePages_Total]
00:05:28.973 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 40586608 kB' 'MemAvailable: 44313832 kB' 'Buffers: 8940 kB' 'Cached: 13233248 kB' 'SwapCached: 0 kB' 'Active: 10523292 kB' 'Inactive: 3688224 kB' 'Active(anon): 10106500 kB' 'Inactive(anon): 0 kB' 'Active(file): 416792 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 972616 kB' 'Mapped: 145904 kB' 'Shmem: 9137172 kB' 'KReclaimable: 238148 kB' 'Slab: 872972 kB' 'SReclaimable: 238148 kB' 'SUnreclaim: 634824 kB' 'KernelStack: 21840 kB' 'PageTables: 8132 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 11629972 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214400 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
[xtrace condensed: the HugePages_Total scan reads and skips MemTotal through AnonHugePages via `continue`]
00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- 
setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:28.974 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 17488484 kB' 'MemUsed: 15096884 kB' 'SwapCached: 0 kB' 'Active: 7543488 kB' 'Inactive: 3531400 kB' 'Active(anon): 7266348 kB' 'Inactive(anon): 0 kB' 'Active(file): 277140 kB' 'Inactive(file): 3531400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10518592 kB' 'Mapped: 128116 kB' 'AnonPages: 559432 kB' 'Shmem: 6710052 kB' 'KernelStack: 12696 kB' 'PageTables: 5488 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 159296 kB' 'Slab: 473584 kB' 'SReclaimable: 159296 kB' 'SUnreclaim: 314288 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.975 14:18:24 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.975 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.976 14:18:24 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024'
00:05:28.976 node0=1024 expecting 1024
00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]]
00:05:28.976
00:05:28.976 real 0m7.371s
00:05:28.976 user 0m2.633s
00:05:28.976 sys 0m4.866s
00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable
00:05:28.976 14:18:24 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x
00:05:28.976 ************************************
00:05:28.976 END TEST no_shrink_alloc
00:05:28.976 ************************************
00:05:28.976 14:18:24 setup.sh.hugepages -- setup/hugepages.sh@206 -- # clear_hp
00:05:28.976 14:18:24 setup.sh.hugepages -- setup/hugepages.sh@36 -- # local node hp
00:05:28.976 14:18:24 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}"
00:05:28.976 14:18:24 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:05:28.976 14:18:24 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0
00:05:28.976 14:18:24 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:05:28.976 14:18:24 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0
00:05:28.976 14:18:24 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}"
00:05:28.976 14:18:24 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:05:28.976 14:18:24 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0
00:05:28.976 14:18:24 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:05:28.976 14:18:24 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0
00:05:28.976 14:18:24 setup.sh.hugepages -- setup/hugepages.sh@44 -- # export CLEAR_HUGE=yes
00:05:28.976 14:18:24 setup.sh.hugepages -- setup/hugepages.sh@44 -- # CLEAR_HUGE=yes
00:05:28.976
00:05:28.976 real 0m24.634s
00:05:28.976 user 0m8.563s
00:05:28.976 sys 0m15.005s
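The long [[ key == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] / continue runs traced above come from the get_meminfo helper in setup/common.sh: it walks a meminfo file one "key: value" line at a time and echoes the value once the requested key matches, so under xtrace every non-matching key costs four records (test, continue, IFS, read), which is why a single HugePages_Total or HugePages_Surp lookup fills a screen. A minimal sketch of the same parsing pattern, assuming simplified argument handling rather than the verbatim SPDK helper:

# Sketch only: the pattern the trace shows, not the exact SPDK source.
# usage: get_meminfo <key> [node], e.g. get_meminfo HugePages_Surp 0
get_meminfo() {
    local get=$1 node=$2 key val
    local mem_f=/proc/meminfo
    # Per-node statistics live in sysfs, as the trace's mem_f reassignment shows.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local -a tok
    while IFS=': ' read -r -a tok; do
        # Per-node lines read "Node 0 HugePages_Surp: 0"; global lines drop the prefix.
        if [[ ${tok[0]} == Node ]]; then
            key=${tok[2]} val=${tok[3]}
        else
            key=${tok[0]} val=${tok[1]}
        fi
        if [[ $key == "$get" ]]; then
            echo "$val"   # a plain count for HugePages_*, a size in kB for most other keys
            return 0
        fi
    done < "$mem_f"
    return 1
}

The "node0=1024 expecting 1024" record above is the assertion built on this lookup: after the shrink attempt, node 0 must still report all 1024 hugepages.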
00:05:28.976 14:18:24 setup.sh.hugepages -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:28.976 14:18:24 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:28.976 ************************************ 00:05:28.976 END TEST hugepages 00:05:28.976 ************************************ 00:05:28.976 14:18:25 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:28.976 14:18:25 setup.sh -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:28.976 14:18:25 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:28.976 14:18:25 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:28.976 ************************************ 00:05:28.976 START TEST driver 00:05:28.976 ************************************ 00:05:28.976 14:18:25 setup.sh.driver -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:29.238 * Looking for test storage... 00:05:29.238 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:29.238 14:18:25 setup.sh.driver -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:05:29.238 14:18:25 setup.sh.driver -- common/autotest_common.sh@1693 -- # lcov --version 00:05:29.238 14:18:25 setup.sh.driver -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:05:29.238 14:18:25 setup.sh.driver -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@336 -- # IFS=.-: 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@336 -- # read -ra ver1 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@337 -- # IFS=.-: 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@337 -- # read -ra ver2 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@338 -- # local 'op=<' 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@340 -- # ver1_l=2 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@341 -- # ver2_l=1 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@344 -- # case "$op" in 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@345 -- # : 1 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@365 -- # decimal 1 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@353 -- # local d=1 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@355 -- # echo 1 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@365 -- # ver1[v]=1 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@366 -- # decimal 2 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@353 -- # local d=2 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@355 -- # echo 2 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@366 -- # ver2[v]=2 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:29.238 14:18:25 setup.sh.driver -- scripts/common.sh@368 -- # return 0 00:05:29.238 14:18:25 setup.sh.driver -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:29.238 14:18:25 setup.sh.driver -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:05:29.238 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:29.238 --rc genhtml_branch_coverage=1 00:05:29.238 --rc genhtml_function_coverage=1 00:05:29.238 --rc genhtml_legend=1 00:05:29.238 --rc geninfo_all_blocks=1 00:05:29.238 --rc geninfo_unexecuted_blocks=1 00:05:29.238 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:29.238 ' 00:05:29.238 14:18:25 setup.sh.driver -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:05:29.238 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:29.238 --rc genhtml_branch_coverage=1 00:05:29.238 --rc genhtml_function_coverage=1 00:05:29.238 --rc genhtml_legend=1 00:05:29.238 --rc geninfo_all_blocks=1 00:05:29.238 --rc geninfo_unexecuted_blocks=1 00:05:29.238 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:29.238 ' 00:05:29.238 14:18:25 setup.sh.driver -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:05:29.238 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:29.238 --rc genhtml_branch_coverage=1 00:05:29.238 --rc genhtml_function_coverage=1 00:05:29.238 --rc genhtml_legend=1 00:05:29.238 --rc geninfo_all_blocks=1 00:05:29.238 --rc geninfo_unexecuted_blocks=1 00:05:29.238 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:29.238 ' 00:05:29.238 14:18:25 setup.sh.driver -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:05:29.238 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:29.238 --rc genhtml_branch_coverage=1 00:05:29.238 --rc genhtml_function_coverage=1 00:05:29.238 --rc genhtml_legend=1 00:05:29.238 --rc geninfo_all_blocks=1 00:05:29.238 --rc geninfo_unexecuted_blocks=1 00:05:29.238 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:29.238 ' 00:05:29.238 14:18:25 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:05:29.238 14:18:25 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:29.238 14:18:25 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:34.527 14:18:30 setup.sh.driver -- 
setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:05:34.527 14:18:30 setup.sh.driver -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:34.527 14:18:30 setup.sh.driver -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:34.527 14:18:30 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:34.527 ************************************ 00:05:34.527 START TEST guess_driver 00:05:34.527 ************************************ 00:05:34.527 14:18:30 setup.sh.driver.guess_driver -- common/autotest_common.sh@1129 -- # guess_driver 00:05:34.527 14:18:30 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:05:34.527 14:18:30 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:05:34.527 14:18:30 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:05:34.527 14:18:30 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:05:34.527 14:18:30 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:05:34.527 14:18:30 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:05:34.527 14:18:30 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:05:34.527 14:18:30 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:05:34.527 14:18:30 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:05:34.527 14:18:30 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 176 > 0 )) 00:05:34.527 14:18:30 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:05:34.527 14:18:30 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:05:34.527 14:18:30 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:05:34.527 14:18:30 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:05:34.527 14:18:30 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:05:34.527 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:34.527 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:34.527 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:34.527 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:34.527 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:05:34.527 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:05:34.527 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:05:34.527 14:18:30 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:05:34.527 14:18:30 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:05:34.527 14:18:30 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:05:34.527 14:18:30 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:05:34.527 14:18:30 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:05:34.527 Looking for driver=vfio-pci 00:05:34.527 14:18:30 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:34.527 14:18:30 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- 
# setup output config 00:05:34.527 14:18:30 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:05:34.527 14:18:30 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:37.830 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:37.830 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:37.830 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:37.830 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:37.830 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:37.830 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:37.830 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:37.830 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:37.830 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:37.830 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:37.830 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:37.830 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:37.831 14:18:33 
setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:37.831 14:18:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:39.744 14:18:35 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:39.744 14:18:35 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:39.744 14:18:35 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:39.744 14:18:35 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:05:39.744 14:18:35 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:05:39.744 14:18:35 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:39.744 14:18:35 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:45.032 00:05:45.032 real 0m10.266s 00:05:45.032 user 0m2.753s 00:05:45.032 sys 0m5.185s 00:05:45.032 14:18:40 setup.sh.driver.guess_driver -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:45.032 14:18:40 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:05:45.032 ************************************ 00:05:45.032 END TEST guess_driver 00:05:45.032 ************************************ 00:05:45.032 00:05:45.032 real 0m15.490s 00:05:45.032 user 0m4.221s 00:05:45.032 sys 0m8.127s 00:05:45.032 14:18:40 setup.sh.driver -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:45.032 14:18:40 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:45.032 ************************************ 00:05:45.032 END TEST driver 00:05:45.032 ************************************ 00:05:45.032 14:18:40 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:05:45.032 14:18:40 setup.sh -- 
common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:45.032 14:18:40 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:45.032 14:18:40 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:45.032 ************************************ 00:05:45.032 START TEST devices 00:05:45.032 ************************************ 00:05:45.032 14:18:40 setup.sh.devices -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:05:45.032 * Looking for test storage... 00:05:45.032 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:45.032 14:18:40 setup.sh.devices -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:05:45.032 14:18:40 setup.sh.devices -- common/autotest_common.sh@1693 -- # lcov --version 00:05:45.032 14:18:40 setup.sh.devices -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:05:45.032 14:18:40 setup.sh.devices -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@336 -- # IFS=.-: 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@336 -- # read -ra ver1 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@337 -- # IFS=.-: 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@337 -- # read -ra ver2 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@338 -- # local 'op=<' 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@340 -- # ver1_l=2 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@341 -- # ver2_l=1 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@344 -- # case "$op" in 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@345 -- # : 1 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@365 -- # decimal 1 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@353 -- # local d=1 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@355 -- # echo 1 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@365 -- # ver1[v]=1 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@366 -- # decimal 2 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@353 -- # local d=2 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@355 -- # echo 2 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@366 -- # ver2[v]=2 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:45.032 14:18:40 setup.sh.devices -- scripts/common.sh@368 -- # return 0 00:05:45.032 14:18:40 setup.sh.devices -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:45.032 14:18:40 setup.sh.devices -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:05:45.032 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:45.032 --rc genhtml_branch_coverage=1 00:05:45.032 --rc genhtml_function_coverage=1 00:05:45.032 --rc genhtml_legend=1 00:05:45.032 --rc geninfo_all_blocks=1 00:05:45.032 --rc geninfo_unexecuted_blocks=1 00:05:45.032 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:45.032 ' 00:05:45.032 14:18:40 setup.sh.devices -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:05:45.032 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:45.032 --rc genhtml_branch_coverage=1 00:05:45.032 --rc genhtml_function_coverage=1 00:05:45.032 --rc genhtml_legend=1 00:05:45.032 --rc geninfo_all_blocks=1 00:05:45.032 --rc geninfo_unexecuted_blocks=1 00:05:45.032 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:45.032 ' 00:05:45.032 14:18:40 setup.sh.devices -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:05:45.032 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:45.032 --rc genhtml_branch_coverage=1 00:05:45.032 --rc genhtml_function_coverage=1 00:05:45.032 --rc genhtml_legend=1 00:05:45.032 --rc geninfo_all_blocks=1 00:05:45.032 --rc geninfo_unexecuted_blocks=1 00:05:45.032 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:45.032 ' 00:05:45.032 14:18:40 setup.sh.devices -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:05:45.032 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:45.032 --rc genhtml_branch_coverage=1 00:05:45.032 --rc genhtml_function_coverage=1 00:05:45.032 --rc genhtml_legend=1 00:05:45.032 --rc geninfo_all_blocks=1 00:05:45.032 --rc geninfo_unexecuted_blocks=1 00:05:45.032 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:45.032 ' 00:05:45.032 14:18:40 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:05:45.032 14:18:40 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:05:45.032 14:18:40 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:45.033 14:18:40 setup.sh.devices -- setup/common.sh@12 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:49.236 14:18:44 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:05:49.236 14:18:44 setup.sh.devices -- common/autotest_common.sh@1657 -- # zoned_devs=() 00:05:49.236 14:18:44 setup.sh.devices -- common/autotest_common.sh@1657 -- # local -gA zoned_devs 00:05:49.236 14:18:44 setup.sh.devices -- common/autotest_common.sh@1658 -- # local nvme bdf 00:05:49.236 14:18:44 setup.sh.devices -- common/autotest_common.sh@1660 -- # for nvme in /sys/block/nvme* 00:05:49.236 14:18:44 setup.sh.devices -- common/autotest_common.sh@1661 -- # is_block_zoned nvme0n1 00:05:49.236 14:18:44 setup.sh.devices -- common/autotest_common.sh@1650 -- # local device=nvme0n1 00:05:49.236 14:18:44 setup.sh.devices -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:49.236 14:18:44 setup.sh.devices -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:05:49.236 14:18:44 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:05:49.236 14:18:44 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:05:49.237 14:18:44 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:49.237 14:18:44 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:49.237 14:18:44 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:05:49.237 14:18:44 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:05:49.237 14:18:44 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:05:49.237 14:18:44 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:05:49.237 14:18:44 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:05:49.237 14:18:44 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:05:49.237 14:18:44 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:05:49.237 14:18:44 setup.sh.devices -- scripts/common.sh@381 -- # local block=nvme0n1 pt 00:05:49.237 14:18:44 setup.sh.devices -- scripts/common.sh@390 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:05:49.237 No valid GPT data, bailing 00:05:49.237 14:18:44 setup.sh.devices -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:49.237 14:18:44 setup.sh.devices -- scripts/common.sh@394 -- # pt= 00:05:49.237 14:18:44 setup.sh.devices -- scripts/common.sh@395 -- # return 1 00:05:49.237 14:18:44 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:05:49.237 14:18:44 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:05:49.237 14:18:44 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:05:49.237 14:18:44 setup.sh.devices -- setup/common.sh@80 -- # echo 1600321314816 00:05:49.237 14:18:44 setup.sh.devices -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:05:49.237 14:18:44 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:05:49.237 14:18:44 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:05:49.237 14:18:44 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:05:49.237 14:18:44 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:05:49.237 14:18:44 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:05:49.237 14:18:44 setup.sh.devices -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:49.237 14:18:44 
setup.sh.devices -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:49.237 14:18:44 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:49.237 ************************************ 00:05:49.237 START TEST nvme_mount 00:05:49.237 ************************************ 00:05:49.237 14:18:44 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1129 -- # nvme_mount 00:05:49.237 14:18:44 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:05:49.237 14:18:44 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:05:49.237 14:18:44 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:49.237 14:18:44 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:49.237 14:18:44 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:05:49.237 14:18:44 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:49.237 14:18:44 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:05:49.237 14:18:44 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:49.237 14:18:44 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:49.237 14:18:44 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:05:49.237 14:18:44 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:05:49.237 14:18:44 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:49.237 14:18:44 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:49.237 14:18:44 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:49.237 14:18:44 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:49.237 14:18:44 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:49.237 14:18:44 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:49.237 14:18:44 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:49.237 14:18:44 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:05:49.807 Creating new GPT entries in memory. 00:05:49.807 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:49.807 other utilities. 00:05:49.807 14:18:45 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:49.807 14:18:45 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:49.807 14:18:45 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:49.807 14:18:45 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:49.807 14:18:45 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:51.190 Creating new GPT entries in memory. 00:05:51.190 The operation has completed successfully. 
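At this point the nvme_mount test has zapped the drive's GPT and created a single 1 GiB partition (sectors 2048-2099199), while sync_dev_uevents.sh runs in the background to catch the kernel's block/partition uevent so the script does not race ahead of the new device node. A minimal sketch of that zap-partition-wait pattern, assuming a 512-byte-sector disk at /dev/nvme0n1 and substituting udevadm settle for the SPDK-specific uevent helper:

    DISK=/dev/nvme0n1
    sgdisk "$DISK" --zap-all                 # destroy GPT + protective MBR
    sgdisk "$DISK" --new=1:2048:2099199      # 1 GiB partition (2097152 x 512 B sectors)
    udevadm settle                           # wait for udev to create the partition node
    [ -b "${DISK}p1" ] && echo "${DISK}p1 is ready"

The flock around sgdisk in the log matters for the same reason: udev takes the same advisory lock when it probes block devices, so holding it keeps udev from re-reading a half-written partition table.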
00:05:51.190 14:18:46 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:51.190 14:18:46 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:51.190 14:18:46 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 292122 00:05:51.190 14:18:46 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:51.190 14:18:46 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:51.190 14:18:46 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:51.190 14:18:46 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:51.190 14:18:46 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:51.190 14:18:46 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:51.191 14:18:47 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:51.191 14:18:47 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:51.191 14:18:47 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:51.191 14:18:47 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:51.191 14:18:47 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:51.191 14:18:47 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:51.191 14:18:47 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:51.191 14:18:47 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:51.191 14:18:47 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:51.191 14:18:47 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:51.191 14:18:47 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:51.191 14:18:47 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:51.191 14:18:47 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:51.191 14:18:47 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == 
\0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:54.566 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:54.566 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:54.567 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:54.844 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:54.844 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:54.844 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:54.844 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:54.844 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:05:54.844 14:18:50 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:05:54.844 14:18:50 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:54.844 14:18:50 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:05:54.844 14:18:50 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:05:54.844 14:18:50 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:54.845 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:54.845 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:54.845 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:05:54.845 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local 
mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:54.845 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:54.845 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:54.845 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:54.845 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:54.845 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:54.845 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:54.845 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:54.845 14:18:50 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:54.845 14:18:50 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:54.845 14:18:50 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.201 14:18:54 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:58.201 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:58.471 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:05:58.471 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:58.471 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:05:58.471 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:58.471 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:05:58.471 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:58.471 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:58.471 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # 
local pci status 00:05:58.471 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.471 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:58.471 14:18:54 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:58.471 14:18:54 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:58.471 14:18:54 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:01.870 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.870 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.870 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.870 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.870 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.870 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.870 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.870 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.870 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.870 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.870 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.870 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.870 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.870 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.870 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.870 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.870 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.870 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.870 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.870 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.870 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.870 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.870 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.871 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.871 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.871 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.871 14:18:57 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.871 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.871 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.871 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.871 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.871 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.871 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:01.871 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:06:01.871 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:06:01.871 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.871 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:01.871 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:06:01.871 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:06:01.871 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:06:01.871 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:01.871 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:01.871 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:01.871 14:18:57 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:01.871 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:01.871 00:06:01.871 real 0m12.995s 00:06:01.871 user 0m3.799s 00:06:01.871 sys 0m7.139s 00:06:01.871 14:18:57 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:01.871 14:18:57 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:06:01.871 ************************************ 00:06:01.871 END TEST nvme_mount 00:06:01.871 ************************************ 00:06:01.871 14:18:57 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:06:01.871 14:18:57 setup.sh.devices -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:01.871 14:18:57 setup.sh.devices -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:01.871 14:18:57 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:06:01.871 ************************************ 00:06:01.871 START TEST dm_mount 00:06:01.871 ************************************ 00:06:01.871 14:18:57 setup.sh.devices.dm_mount -- common/autotest_common.sh@1129 -- # dm_mount 00:06:01.871 14:18:57 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:06:01.871 14:18:57 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:06:01.871 14:18:57 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:06:01.871 14:18:57 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:06:01.871 14:18:57 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # 
local disk=nvme0n1 00:06:01.871 14:18:57 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:06:01.871 14:18:57 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:06:01.871 14:18:57 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:06:01.871 14:18:57 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:06:01.871 14:18:57 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:06:01.871 14:18:57 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:06:01.871 14:18:57 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:01.871 14:18:57 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:01.871 14:18:57 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:06:01.871 14:18:57 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:01.871 14:18:57 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:01.871 14:18:57 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:06:01.871 14:18:57 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:01.871 14:18:57 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:06:01.871 14:18:57 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:06:01.871 14:18:57 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:06:02.813 Creating new GPT entries in memory. 00:06:02.813 GPT data structures destroyed! You may now partition the disk using fdisk or 00:06:02.813 other utilities. 00:06:03.074 14:18:58 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:06:03.074 14:18:58 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:03.074 14:18:58 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:03.074 14:18:58 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:03.074 14:18:58 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:06:04.016 Creating new GPT entries in memory. 00:06:04.016 The operation has completed successfully. 00:06:04.016 14:18:59 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:06:04.016 14:18:59 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:04.016 14:18:59 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:04.016 14:18:59 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:04.016 14:18:59 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:06:04.959 The operation has completed successfully. 
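The dm_mount test has now carved the same disk into two 1 GiB partitions (sectors 2048-2099199 and 2099200-4196351); the next step in the log glues them into a single device-mapper node, nvme_dm_test, and verifies that both partitions show up as holders of the resulting dm-0. The exact table the test feeds dmsetup is not shown in this output, so the mapping below is only a hypothetical sketch of building such a device by hand with a linear table, assuming 512-byte sectors:

    # two 2097152-sector partitions concatenated into one 2 GiB dm device
    dmsetup create nvme_dm_test <<'EOF'
    0 2097152 linear /dev/nvme0n1p1 0
    2097152 2097152 linear /dev/nvme0n1p2 0
    EOF
    readlink -f /dev/mapper/nvme_dm_test     # resolves to /dev/dm-0 (or similar)
    ls /sys/class/block/nvme0n1p1/holders    # dm-0 appears here once mapped

Once the mapping exists, /dev/mapper/nvme_dm_test behaves like any other block device, which is why the test can mkfs.ext4 and mount it exactly as it did the raw partition earlier.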
00:06:04.959 14:19:00 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:06:04.959 14:19:00 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:04.959 14:19:00 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 296614 00:06:04.959 14:19:01 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:06:04.959 14:19:01 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:04.959 14:19:01 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:04.959 14:19:01 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- 
setup/devices.sh@53 -- # local found=0 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:05.220 14:19:01 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:08.522 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:08.522 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:06:08.523 14:19:04 
setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:08.523 14:19:04 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:06:11.824 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:11.824 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.824 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:11.824 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.824 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:11.824 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.824 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:11.824 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.824 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:11.824 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.824 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:11.824 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.824 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:11.824 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.824 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:11.824 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.824 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:11.824 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.824 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:11.824 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.824 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:11.824 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.824 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:11.824 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.824 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:11.824 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.824 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:11.825 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.825 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:11.825 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ 
status 00:06:11.825 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:11.825 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:11.825 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:06:11.825 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:06:11.825 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:06:11.825 14:19:07 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:12.085 14:19:08 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:12.085 14:19:08 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:06:12.085 14:19:08 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:06:12.085 14:19:08 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:06:12.085 14:19:08 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:12.085 14:19:08 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:12.085 14:19:08 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:06:12.085 14:19:08 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:12.085 14:19:08 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:06:12.085 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:12.085 14:19:08 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:12.085 14:19:08 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:06:12.085 00:06:12.085 real 0m10.214s 00:06:12.085 user 0m2.509s 00:06:12.085 sys 0m4.802s 00:06:12.085 14:19:08 setup.sh.devices.dm_mount -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:12.085 14:19:08 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:06:12.085 ************************************ 00:06:12.085 END TEST dm_mount 00:06:12.085 ************************************ 00:06:12.085 14:19:08 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:06:12.086 14:19:08 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:06:12.086 14:19:08 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:06:12.086 14:19:08 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:12.086 14:19:08 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:06:12.086 14:19:08 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:12.086 14:19:08 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:12.346 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:06:12.346 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:06:12.346 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:06:12.346 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:06:12.346 14:19:08 setup.sh.devices -- 
setup/devices.sh@12 -- # cleanup_dm 00:06:12.346 14:19:08 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:06:12.346 14:19:08 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:12.346 14:19:08 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:12.346 14:19:08 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:12.346 14:19:08 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:06:12.346 14:19:08 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:06:12.606 00:06:12.606 real 0m27.832s 00:06:12.606 user 0m7.926s 00:06:12.606 sys 0m14.874s 00:06:12.606 14:19:08 setup.sh.devices -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:12.606 14:19:08 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:06:12.606 ************************************ 00:06:12.606 END TEST devices 00:06:12.606 ************************************ 00:06:12.606 00:06:12.606 real 1m34.034s 00:06:12.606 user 0m29.152s 00:06:12.606 sys 0m53.890s 00:06:12.606 14:19:08 setup.sh -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:12.606 14:19:08 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:06:12.606 ************************************ 00:06:12.606 END TEST setup.sh 00:06:12.606 ************************************ 00:06:12.606 14:19:08 -- spdk/autotest.sh@115 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:06:15.920 Hugepages 00:06:15.920 node hugesize free / total 00:06:15.920 node0 1048576kB 0 / 0 00:06:15.920 node0 2048kB 1024 / 1024 00:06:15.920 node1 1048576kB 0 / 0 00:06:15.920 node1 2048kB 1024 / 1024 00:06:15.920 00:06:15.920 Type BDF Vendor Device NUMA Driver Device Block devices 00:06:15.920 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:06:15.920 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:06:15.920 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:06:15.920 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:06:15.920 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:06:15.920 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:06:15.920 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:06:15.920 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:06:15.920 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:06:15.920 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:06:15.920 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:06:15.920 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:06:15.920 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:06:15.920 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:06:15.920 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:06:15.920 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:06:16.181 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:06:16.181 14:19:12 -- spdk/autotest.sh@117 -- # uname -s 00:06:16.181 14:19:12 -- spdk/autotest.sh@117 -- # [[ Linux == Linux ]] 00:06:16.181 14:19:12 -- spdk/autotest.sh@119 -- # nvme_namespace_revert 00:06:16.181 14:19:12 -- common/autotest_common.sh@1516 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:06:19.479 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:19.479 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:19.479 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:19.479 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:19.479 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:19.479 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:19.479 0000:00:04.1 (8086 2021): ioatdma 
-> vfio-pci 00:06:19.479 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:19.740 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:19.740 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:19.740 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:19.740 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:19.740 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:19.740 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:19.740 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:19.740 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:21.122 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:06:21.382 14:19:17 -- common/autotest_common.sh@1517 -- # sleep 1 00:06:22.324 14:19:18 -- common/autotest_common.sh@1518 -- # bdfs=() 00:06:22.324 14:19:18 -- common/autotest_common.sh@1518 -- # local bdfs 00:06:22.324 14:19:18 -- common/autotest_common.sh@1520 -- # bdfs=($(get_nvme_bdfs)) 00:06:22.324 14:19:18 -- common/autotest_common.sh@1520 -- # get_nvme_bdfs 00:06:22.324 14:19:18 -- common/autotest_common.sh@1498 -- # bdfs=() 00:06:22.324 14:19:18 -- common/autotest_common.sh@1498 -- # local bdfs 00:06:22.324 14:19:18 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:22.324 14:19:18 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:22.324 14:19:18 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:06:22.324 14:19:18 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:06:22.324 14:19:18 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:06:22.324 14:19:18 -- common/autotest_common.sh@1522 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:06:25.624 Waiting for block devices as requested 00:06:25.885 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:25.885 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:25.885 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:26.145 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:26.145 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:26.145 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:26.405 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:26.405 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:26.405 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:26.666 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:26.666 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:26.666 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:26.927 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:26.927 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:26.927 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:27.187 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:27.187 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:06:27.448 14:19:23 -- common/autotest_common.sh@1524 -- # for bdf in "${bdfs[@]}" 00:06:27.448 14:19:23 -- common/autotest_common.sh@1525 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:06:27.449 14:19:23 -- common/autotest_common.sh@1487 -- # readlink -f /sys/class/nvme/nvme0 00:06:27.449 14:19:23 -- common/autotest_common.sh@1487 -- # grep 0000:d8:00.0/nvme/nvme 00:06:27.449 14:19:23 -- common/autotest_common.sh@1487 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:06:27.449 14:19:23 -- common/autotest_common.sh@1488 -- # [[ -z /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:06:27.449 14:19:23 -- common/autotest_common.sh@1492 -- # basename 
/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:06:27.449 14:19:23 -- common/autotest_common.sh@1492 -- # printf '%s\n' nvme0 00:06:27.449 14:19:23 -- common/autotest_common.sh@1525 -- # nvme_ctrlr=/dev/nvme0 00:06:27.449 14:19:23 -- common/autotest_common.sh@1526 -- # [[ -z /dev/nvme0 ]] 00:06:27.449 14:19:23 -- common/autotest_common.sh@1531 -- # nvme id-ctrl /dev/nvme0 00:06:27.449 14:19:23 -- common/autotest_common.sh@1531 -- # grep oacs 00:06:27.449 14:19:23 -- common/autotest_common.sh@1531 -- # cut -d: -f2 00:06:27.449 14:19:23 -- common/autotest_common.sh@1531 -- # oacs=' 0xe' 00:06:27.449 14:19:23 -- common/autotest_common.sh@1532 -- # oacs_ns_manage=8 00:06:27.449 14:19:23 -- common/autotest_common.sh@1534 -- # [[ 8 -ne 0 ]] 00:06:27.449 14:19:23 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme0 00:06:27.449 14:19:23 -- common/autotest_common.sh@1540 -- # grep unvmcap 00:06:27.449 14:19:23 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:06:27.449 14:19:23 -- common/autotest_common.sh@1540 -- # unvmcap=' 0' 00:06:27.449 14:19:23 -- common/autotest_common.sh@1541 -- # [[ 0 -eq 0 ]] 00:06:27.449 14:19:23 -- common/autotest_common.sh@1543 -- # continue 00:06:27.449 14:19:23 -- spdk/autotest.sh@122 -- # timing_exit pre_cleanup 00:06:27.449 14:19:23 -- common/autotest_common.sh@732 -- # xtrace_disable 00:06:27.449 14:19:23 -- common/autotest_common.sh@10 -- # set +x 00:06:27.449 14:19:23 -- spdk/autotest.sh@125 -- # timing_enter afterboot 00:06:27.449 14:19:23 -- common/autotest_common.sh@726 -- # xtrace_disable 00:06:27.449 14:19:23 -- common/autotest_common.sh@10 -- # set +x 00:06:27.449 14:19:23 -- spdk/autotest.sh@126 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:06:30.750 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:30.750 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:30.750 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:30.750 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:30.750 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:30.750 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:31.010 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:31.010 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:31.010 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:31.010 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:31.010 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:31.010 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:31.010 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:31.010 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:31.010 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:31.010 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:32.392 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:06:32.652 14:19:28 -- spdk/autotest.sh@127 -- # timing_exit afterboot 00:06:32.653 14:19:28 -- common/autotest_common.sh@732 -- # xtrace_disable 00:06:32.653 14:19:28 -- common/autotest_common.sh@10 -- # set +x 00:06:32.653 14:19:28 -- spdk/autotest.sh@131 -- # opal_revert_cleanup 00:06:32.653 14:19:28 -- common/autotest_common.sh@1578 -- # mapfile -t bdfs 00:06:32.653 14:19:28 -- common/autotest_common.sh@1578 -- # get_nvme_bdfs_by_id 0x0a54 00:06:32.653 14:19:28 -- common/autotest_common.sh@1563 -- # bdfs=() 00:06:32.653 14:19:28 -- common/autotest_common.sh@1563 -- # _bdfs=() 00:06:32.653 14:19:28 -- common/autotest_common.sh@1563 -- # local bdfs _bdfs 00:06:32.653 14:19:28 -- common/autotest_common.sh@1564 -- # _bdfs=($(get_nvme_bdfs)) 00:06:32.653 14:19:28 -- 
common/autotest_common.sh@1564 -- # get_nvme_bdfs 00:06:32.653 14:19:28 -- common/autotest_common.sh@1498 -- # bdfs=() 00:06:32.653 14:19:28 -- common/autotest_common.sh@1498 -- # local bdfs 00:06:32.653 14:19:28 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:32.653 14:19:28 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:32.653 14:19:28 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:06:32.913 14:19:28 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:06:32.913 14:19:28 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:06:32.913 14:19:28 -- common/autotest_common.sh@1565 -- # for bdf in "${_bdfs[@]}" 00:06:32.913 14:19:28 -- common/autotest_common.sh@1566 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:06:32.913 14:19:28 -- common/autotest_common.sh@1566 -- # device=0x0a54 00:06:32.913 14:19:28 -- common/autotest_common.sh@1567 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:06:32.913 14:19:28 -- common/autotest_common.sh@1568 -- # bdfs+=($bdf) 00:06:32.913 14:19:28 -- common/autotest_common.sh@1572 -- # (( 1 > 0 )) 00:06:32.913 14:19:28 -- common/autotest_common.sh@1573 -- # printf '%s\n' 0000:d8:00.0 00:06:32.913 14:19:28 -- common/autotest_common.sh@1579 -- # [[ -z 0000:d8:00.0 ]] 00:06:32.913 14:19:28 -- common/autotest_common.sh@1584 -- # spdk_tgt_pid=306623 00:06:32.913 14:19:28 -- common/autotest_common.sh@1585 -- # waitforlisten 306623 00:06:32.913 14:19:28 -- common/autotest_common.sh@1583 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:32.913 14:19:28 -- common/autotest_common.sh@835 -- # '[' -z 306623 ']' 00:06:32.913 14:19:28 -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:32.913 14:19:28 -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:32.913 14:19:28 -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:32.913 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:32.913 14:19:28 -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:32.913 14:19:28 -- common/autotest_common.sh@10 -- # set +x 00:06:32.913 [2024-11-18 14:19:28.822340] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 
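The opal_revert_cleanup pass above filters the discovered BDFs down to controllers whose PCI device ID is 0x0a54 (8086 0a54, the controller logged throughout this run). A minimal standalone sketch of that filter, assuming the gen_nvme.sh output and sysfs layout seen here:

  # Collect NVMe transport addresses, then keep only BDFs whose
  # PCI device ID matches 0x0a54, as get_nvme_bdfs_by_id does above.
  bdfs=($(./scripts/gen_nvme.sh | jq -r '.config[].params.traddr'))
  matched=()
  for bdf in "${bdfs[@]}"; do
      device=$(cat "/sys/bus/pci/devices/${bdf}/device")
      [[ "$device" == "0x0a54" ]] && matched+=("$bdf")
  done
  printf '%s\n' "${matched[@]}"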
00:06:32.913 [2024-11-18 14:19:28.822401] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid306623 ] 00:06:32.913 [2024-11-18 14:19:28.909123] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:32.913 [2024-11-18 14:19:28.933014] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.173 14:19:29 -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:33.173 14:19:29 -- common/autotest_common.sh@868 -- # return 0 00:06:33.173 14:19:29 -- common/autotest_common.sh@1587 -- # bdf_id=0 00:06:33.173 14:19:29 -- common/autotest_common.sh@1588 -- # for bdf in "${bdfs[@]}" 00:06:33.173 14:19:29 -- common/autotest_common.sh@1589 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:06:36.467 nvme0n1 00:06:36.467 14:19:32 -- common/autotest_common.sh@1591 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:06:36.467 [2024-11-18 14:19:32.351016] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:06:36.467 request: 00:06:36.467 { 00:06:36.467 "nvme_ctrlr_name": "nvme0", 00:06:36.467 "password": "test", 00:06:36.467 "method": "bdev_nvme_opal_revert", 00:06:36.467 "req_id": 1 00:06:36.467 } 00:06:36.467 Got JSON-RPC error response 00:06:36.467 response: 00:06:36.467 { 00:06:36.467 "code": -32602, 00:06:36.467 "message": "Invalid parameters" 00:06:36.467 } 00:06:36.467 14:19:32 -- common/autotest_common.sh@1591 -- # true 00:06:36.467 14:19:32 -- common/autotest_common.sh@1592 -- # (( ++bdf_id )) 00:06:36.467 14:19:32 -- common/autotest_common.sh@1595 -- # killprocess 306623 00:06:36.467 14:19:32 -- common/autotest_common.sh@954 -- # '[' -z 306623 ']' 00:06:36.467 14:19:32 -- common/autotest_common.sh@958 -- # kill -0 306623 00:06:36.467 14:19:32 -- common/autotest_common.sh@959 -- # uname 00:06:36.467 14:19:32 -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:36.467 14:19:32 -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 306623 00:06:36.467 14:19:32 -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:36.467 14:19:32 -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:36.467 14:19:32 -- common/autotest_common.sh@972 -- # echo 'killing process with pid 306623' 00:06:36.467 killing process with pid 306623 00:06:36.467 14:19:32 -- common/autotest_common.sh@973 -- # kill 306623 00:06:36.467 14:19:32 -- common/autotest_common.sh@978 -- # wait 306623 00:06:39.008 14:19:34 -- spdk/autotest.sh@137 -- # '[' 0 -eq 1 ']' 00:06:39.008 14:19:34 -- spdk/autotest.sh@141 -- # '[' 1 -eq 1 ']' 00:06:39.008 14:19:34 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:06:39.008 14:19:34 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:06:39.008 14:19:34 -- spdk/autotest.sh@149 -- # timing_enter lib 00:06:39.008 14:19:34 -- common/autotest_common.sh@726 -- # xtrace_disable 00:06:39.008 14:19:34 -- common/autotest_common.sh@10 -- # set +x 00:06:39.008 14:19:34 -- spdk/autotest.sh@151 -- # [[ 0 -eq 1 ]] 00:06:39.008 14:19:34 -- spdk/autotest.sh@155 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:06:39.008 14:19:34 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:39.008 14:19:34 -- common/autotest_common.sh@1111 -- # 
xtrace_disable 00:06:39.008 14:19:34 -- common/autotest_common.sh@10 -- # set +x 00:06:39.008 ************************************ 00:06:39.008 START TEST env 00:06:39.008 ************************************ 00:06:39.008 14:19:34 env -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:06:39.008 * Looking for test storage... 00:06:39.008 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:06:39.008 14:19:34 env -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:39.008 14:19:34 env -- common/autotest_common.sh@1693 -- # lcov --version 00:06:39.008 14:19:34 env -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:39.008 14:19:34 env -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:39.008 14:19:34 env -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:39.008 14:19:34 env -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:39.008 14:19:34 env -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:39.008 14:19:34 env -- scripts/common.sh@336 -- # IFS=.-: 00:06:39.008 14:19:34 env -- scripts/common.sh@336 -- # read -ra ver1 00:06:39.008 14:19:34 env -- scripts/common.sh@337 -- # IFS=.-: 00:06:39.008 14:19:34 env -- scripts/common.sh@337 -- # read -ra ver2 00:06:39.008 14:19:34 env -- scripts/common.sh@338 -- # local 'op=<' 00:06:39.008 14:19:34 env -- scripts/common.sh@340 -- # ver1_l=2 00:06:39.008 14:19:34 env -- scripts/common.sh@341 -- # ver2_l=1 00:06:39.008 14:19:34 env -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:39.008 14:19:34 env -- scripts/common.sh@344 -- # case "$op" in 00:06:39.008 14:19:34 env -- scripts/common.sh@345 -- # : 1 00:06:39.008 14:19:34 env -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:39.008 14:19:34 env -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:39.008 14:19:34 env -- scripts/common.sh@365 -- # decimal 1 00:06:39.008 14:19:34 env -- scripts/common.sh@353 -- # local d=1 00:06:39.008 14:19:34 env -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:39.008 14:19:34 env -- scripts/common.sh@355 -- # echo 1 00:06:39.008 14:19:34 env -- scripts/common.sh@365 -- # ver1[v]=1 00:06:39.008 14:19:34 env -- scripts/common.sh@366 -- # decimal 2 00:06:39.008 14:19:34 env -- scripts/common.sh@353 -- # local d=2 00:06:39.008 14:19:34 env -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:39.008 14:19:34 env -- scripts/common.sh@355 -- # echo 2 00:06:39.008 14:19:34 env -- scripts/common.sh@366 -- # ver2[v]=2 00:06:39.008 14:19:34 env -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:39.008 14:19:34 env -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:39.008 14:19:34 env -- scripts/common.sh@368 -- # return 0 00:06:39.008 14:19:34 env -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:39.008 14:19:34 env -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:39.008 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:39.008 --rc genhtml_branch_coverage=1 00:06:39.008 --rc genhtml_function_coverage=1 00:06:39.008 --rc genhtml_legend=1 00:06:39.008 --rc geninfo_all_blocks=1 00:06:39.008 --rc geninfo_unexecuted_blocks=1 00:06:39.008 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:39.008 ' 00:06:39.008 14:19:34 env -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:39.008 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:39.008 --rc genhtml_branch_coverage=1 00:06:39.008 --rc genhtml_function_coverage=1 00:06:39.008 --rc genhtml_legend=1 00:06:39.008 --rc geninfo_all_blocks=1 00:06:39.008 --rc geninfo_unexecuted_blocks=1 00:06:39.008 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:39.008 ' 00:06:39.008 14:19:34 env -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:39.008 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:39.008 --rc genhtml_branch_coverage=1 00:06:39.008 --rc genhtml_function_coverage=1 00:06:39.008 --rc genhtml_legend=1 00:06:39.008 --rc geninfo_all_blocks=1 00:06:39.008 --rc geninfo_unexecuted_blocks=1 00:06:39.008 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:39.008 ' 00:06:39.008 14:19:34 env -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:39.008 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:39.008 --rc genhtml_branch_coverage=1 00:06:39.008 --rc genhtml_function_coverage=1 00:06:39.008 --rc genhtml_legend=1 00:06:39.008 --rc geninfo_all_blocks=1 00:06:39.008 --rc geninfo_unexecuted_blocks=1 00:06:39.008 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:39.008 ' 00:06:39.008 14:19:34 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:06:39.008 14:19:34 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:39.008 14:19:34 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:39.008 14:19:34 env -- common/autotest_common.sh@10 -- # set +x 00:06:39.008 ************************************ 00:06:39.008 START TEST env_memory 00:06:39.008 ************************************ 00:06:39.008 14:19:34 env.env_memory -- 
common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:06:39.008 00:06:39.008 00:06:39.008 CUnit - A unit testing framework for C - Version 2.1-3 00:06:39.008 http://cunit.sourceforge.net/ 00:06:39.008 00:06:39.008 00:06:39.008 Suite: memory 00:06:39.008 Test: alloc and free memory map ...[2024-11-18 14:19:34.860617] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:06:39.008 passed 00:06:39.008 Test: mem map translation ...[2024-11-18 14:19:34.873373] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 596:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:06:39.008 [2024-11-18 14:19:34.873389] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 596:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:06:39.008 [2024-11-18 14:19:34.873419] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 589:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:06:39.008 [2024-11-18 14:19:34.873432] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 605:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:06:39.008 passed 00:06:39.008 Test: mem map registration ...[2024-11-18 14:19:34.893775] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 348:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=200000 len=1234 00:06:39.008 [2024-11-18 14:19:34.893800] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 348:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=4d2 len=2097152 00:06:39.008 passed 00:06:39.008 Test: mem map adjacent registrations ...passed 00:06:39.008 00:06:39.008 Run Summary: Type Total Ran Passed Failed Inactive 00:06:39.008 suites 1 1 n/a 0 0 00:06:39.008 tests 4 4 4 0 0 00:06:39.008 asserts 152 152 152 0 n/a 00:06:39.008 00:06:39.008 Elapsed time = 0.084 seconds 00:06:39.008 00:06:39.008 real 0m0.097s 00:06:39.008 user 0m0.081s 00:06:39.008 sys 0m0.016s 00:06:39.008 14:19:34 env.env_memory -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:39.008 14:19:34 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:06:39.008 ************************************ 00:06:39.008 END TEST env_memory 00:06:39.008 ************************************ 00:06:39.009 14:19:34 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:39.009 14:19:34 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:39.009 14:19:34 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:39.009 14:19:34 env -- common/autotest_common.sh@10 -- # set +x 00:06:39.009 ************************************ 00:06:39.009 START TEST env_vtophys 00:06:39.009 ************************************ 00:06:39.009 14:19:35 env.env_vtophys -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:39.009 EAL: lib.eal log level changed from notice to debug 00:06:39.009 EAL: Detected lcore 0 as core 0 on socket 0 00:06:39.009 EAL: Detected lcore 1 as core 1 on socket 0 00:06:39.009 EAL: Detected lcore 2 as core 2 on socket 0 00:06:39.009 EAL: Detected lcore 3 as 
core 3 on socket 0 00:06:39.009 EAL: Detected lcore 4 as core 4 on socket 0 00:06:39.009 EAL: Detected lcore 5 as core 5 on socket 0 00:06:39.009 EAL: Detected lcore 6 as core 6 on socket 0 00:06:39.009 EAL: Detected lcore 7 as core 8 on socket 0 00:06:39.009 EAL: Detected lcore 8 as core 9 on socket 0 00:06:39.009 EAL: Detected lcore 9 as core 10 on socket 0 00:06:39.009 EAL: Detected lcore 10 as core 11 on socket 0 00:06:39.009 EAL: Detected lcore 11 as core 12 on socket 0 00:06:39.009 EAL: Detected lcore 12 as core 13 on socket 0 00:06:39.009 EAL: Detected lcore 13 as core 14 on socket 0 00:06:39.009 EAL: Detected lcore 14 as core 16 on socket 0 00:06:39.009 EAL: Detected lcore 15 as core 17 on socket 0 00:06:39.009 EAL: Detected lcore 16 as core 18 on socket 0 00:06:39.009 EAL: Detected lcore 17 as core 19 on socket 0 00:06:39.009 EAL: Detected lcore 18 as core 20 on socket 0 00:06:39.009 EAL: Detected lcore 19 as core 21 on socket 0 00:06:39.009 EAL: Detected lcore 20 as core 22 on socket 0 00:06:39.009 EAL: Detected lcore 21 as core 24 on socket 0 00:06:39.009 EAL: Detected lcore 22 as core 25 on socket 0 00:06:39.009 EAL: Detected lcore 23 as core 26 on socket 0 00:06:39.009 EAL: Detected lcore 24 as core 27 on socket 0 00:06:39.009 EAL: Detected lcore 25 as core 28 on socket 0 00:06:39.009 EAL: Detected lcore 26 as core 29 on socket 0 00:06:39.009 EAL: Detected lcore 27 as core 30 on socket 0 00:06:39.009 EAL: Detected lcore 28 as core 0 on socket 1 00:06:39.009 EAL: Detected lcore 29 as core 1 on socket 1 00:06:39.009 EAL: Detected lcore 30 as core 2 on socket 1 00:06:39.009 EAL: Detected lcore 31 as core 3 on socket 1 00:06:39.009 EAL: Detected lcore 32 as core 4 on socket 1 00:06:39.009 EAL: Detected lcore 33 as core 5 on socket 1 00:06:39.009 EAL: Detected lcore 34 as core 6 on socket 1 00:06:39.009 EAL: Detected lcore 35 as core 8 on socket 1 00:06:39.009 EAL: Detected lcore 36 as core 9 on socket 1 00:06:39.009 EAL: Detected lcore 37 as core 10 on socket 1 00:06:39.009 EAL: Detected lcore 38 as core 11 on socket 1 00:06:39.009 EAL: Detected lcore 39 as core 12 on socket 1 00:06:39.009 EAL: Detected lcore 40 as core 13 on socket 1 00:06:39.009 EAL: Detected lcore 41 as core 14 on socket 1 00:06:39.009 EAL: Detected lcore 42 as core 16 on socket 1 00:06:39.009 EAL: Detected lcore 43 as core 17 on socket 1 00:06:39.009 EAL: Detected lcore 44 as core 18 on socket 1 00:06:39.009 EAL: Detected lcore 45 as core 19 on socket 1 00:06:39.009 EAL: Detected lcore 46 as core 20 on socket 1 00:06:39.009 EAL: Detected lcore 47 as core 21 on socket 1 00:06:39.009 EAL: Detected lcore 48 as core 22 on socket 1 00:06:39.009 EAL: Detected lcore 49 as core 24 on socket 1 00:06:39.009 EAL: Detected lcore 50 as core 25 on socket 1 00:06:39.009 EAL: Detected lcore 51 as core 26 on socket 1 00:06:39.009 EAL: Detected lcore 52 as core 27 on socket 1 00:06:39.009 EAL: Detected lcore 53 as core 28 on socket 1 00:06:39.009 EAL: Detected lcore 54 as core 29 on socket 1 00:06:39.009 EAL: Detected lcore 55 as core 30 on socket 1 00:06:39.009 EAL: Detected lcore 56 as core 0 on socket 0 00:06:39.009 EAL: Detected lcore 57 as core 1 on socket 0 00:06:39.009 EAL: Detected lcore 58 as core 2 on socket 0 00:06:39.009 EAL: Detected lcore 59 as core 3 on socket 0 00:06:39.009 EAL: Detected lcore 60 as core 4 on socket 0 00:06:39.009 EAL: Detected lcore 61 as core 5 on socket 0 00:06:39.009 EAL: Detected lcore 62 as core 6 on socket 0 00:06:39.009 EAL: Detected lcore 63 as core 8 on socket 0 00:06:39.009 EAL: 
Detected lcore 64 as core 9 on socket 0 00:06:39.009 EAL: Detected lcore 65 as core 10 on socket 0 00:06:39.009 EAL: Detected lcore 66 as core 11 on socket 0 00:06:39.009 EAL: Detected lcore 67 as core 12 on socket 0 00:06:39.009 EAL: Detected lcore 68 as core 13 on socket 0 00:06:39.009 EAL: Detected lcore 69 as core 14 on socket 0 00:06:39.009 EAL: Detected lcore 70 as core 16 on socket 0 00:06:39.009 EAL: Detected lcore 71 as core 17 on socket 0 00:06:39.009 EAL: Detected lcore 72 as core 18 on socket 0 00:06:39.009 EAL: Detected lcore 73 as core 19 on socket 0 00:06:39.009 EAL: Detected lcore 74 as core 20 on socket 0 00:06:39.009 EAL: Detected lcore 75 as core 21 on socket 0 00:06:39.009 EAL: Detected lcore 76 as core 22 on socket 0 00:06:39.009 EAL: Detected lcore 77 as core 24 on socket 0 00:06:39.009 EAL: Detected lcore 78 as core 25 on socket 0 00:06:39.009 EAL: Detected lcore 79 as core 26 on socket 0 00:06:39.009 EAL: Detected lcore 80 as core 27 on socket 0 00:06:39.009 EAL: Detected lcore 81 as core 28 on socket 0 00:06:39.009 EAL: Detected lcore 82 as core 29 on socket 0 00:06:39.009 EAL: Detected lcore 83 as core 30 on socket 0 00:06:39.009 EAL: Detected lcore 84 as core 0 on socket 1 00:06:39.009 EAL: Detected lcore 85 as core 1 on socket 1 00:06:39.009 EAL: Detected lcore 86 as core 2 on socket 1 00:06:39.009 EAL: Detected lcore 87 as core 3 on socket 1 00:06:39.009 EAL: Detected lcore 88 as core 4 on socket 1 00:06:39.009 EAL: Detected lcore 89 as core 5 on socket 1 00:06:39.009 EAL: Detected lcore 90 as core 6 on socket 1 00:06:39.009 EAL: Detected lcore 91 as core 8 on socket 1 00:06:39.009 EAL: Detected lcore 92 as core 9 on socket 1 00:06:39.009 EAL: Detected lcore 93 as core 10 on socket 1 00:06:39.009 EAL: Detected lcore 94 as core 11 on socket 1 00:06:39.009 EAL: Detected lcore 95 as core 12 on socket 1 00:06:39.009 EAL: Detected lcore 96 as core 13 on socket 1 00:06:39.009 EAL: Detected lcore 97 as core 14 on socket 1 00:06:39.009 EAL: Detected lcore 98 as core 16 on socket 1 00:06:39.009 EAL: Detected lcore 99 as core 17 on socket 1 00:06:39.009 EAL: Detected lcore 100 as core 18 on socket 1 00:06:39.009 EAL: Detected lcore 101 as core 19 on socket 1 00:06:39.009 EAL: Detected lcore 102 as core 20 on socket 1 00:06:39.009 EAL: Detected lcore 103 as core 21 on socket 1 00:06:39.009 EAL: Detected lcore 104 as core 22 on socket 1 00:06:39.009 EAL: Detected lcore 105 as core 24 on socket 1 00:06:39.009 EAL: Detected lcore 106 as core 25 on socket 1 00:06:39.009 EAL: Detected lcore 107 as core 26 on socket 1 00:06:39.009 EAL: Detected lcore 108 as core 27 on socket 1 00:06:39.009 EAL: Detected lcore 109 as core 28 on socket 1 00:06:39.009 EAL: Detected lcore 110 as core 29 on socket 1 00:06:39.009 EAL: Detected lcore 111 as core 30 on socket 1 00:06:39.009 EAL: Maximum logical cores by configuration: 128 00:06:39.009 EAL: Detected CPU lcores: 112 00:06:39.009 EAL: Detected NUMA nodes: 2 00:06:39.009 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:06:39.009 EAL: Checking presence of .so 'librte_eal.so.24' 00:06:39.009 EAL: Checking presence of .so 'librte_eal.so' 00:06:39.009 EAL: Detected static linkage of DPDK 00:06:39.009 EAL: No shared files mode enabled, IPC will be disabled 00:06:39.009 EAL: Bus pci wants IOVA as 'DC' 00:06:39.009 EAL: Buses did not request a specific IOVA mode. 00:06:39.009 EAL: IOMMU is available, selecting IOVA as VA mode. 00:06:39.009 EAL: Selected IOVA mode 'VA' 00:06:39.009 EAL: Probing VFIO support... 
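The lcore map EAL prints above (112 lcores on 2 sockets) mirrors the kernel's own topology view; a rough cross-check against sysfs, assuming the standard Linux layout:

  # Print each logical CPU's core and socket, matching EAL's
  # "Detected lcore N as core C on socket S" lines above.
  for cpu in /sys/devices/system/cpu/cpu[0-9]*; do
      lcore=${cpu##*cpu}
      core=$(cat "$cpu/topology/core_id")
      socket=$(cat "$cpu/topology/physical_package_id")
      echo "lcore $lcore -> core $core on socket $socket"
  done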
00:06:39.009 EAL: IOMMU type 1 (Type 1) is supported 00:06:39.009 EAL: IOMMU type 7 (sPAPR) is not supported 00:06:39.009 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:06:39.009 EAL: VFIO support initialized 00:06:39.009 EAL: Ask a virtual area of 0x2e000 bytes 00:06:39.009 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:06:39.009 EAL: Setting up physically contiguous memory... 00:06:39.009 EAL: Setting maximum number of open files to 524288 00:06:39.009 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:06:39.009 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:06:39.009 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:06:39.009 EAL: Ask a virtual area of 0x61000 bytes 00:06:39.009 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:06:39.009 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:39.009 EAL: Ask a virtual area of 0x400000000 bytes 00:06:39.009 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:06:39.009 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:06:39.009 EAL: Ask a virtual area of 0x61000 bytes 00:06:39.009 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:06:39.009 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:39.009 EAL: Ask a virtual area of 0x400000000 bytes 00:06:39.009 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:06:39.009 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:06:39.009 EAL: Ask a virtual area of 0x61000 bytes 00:06:39.009 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:06:39.009 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:39.010 EAL: Ask a virtual area of 0x400000000 bytes 00:06:39.010 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:06:39.010 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:06:39.010 EAL: Ask a virtual area of 0x61000 bytes 00:06:39.010 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:06:39.010 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:39.010 EAL: Ask a virtual area of 0x400000000 bytes 00:06:39.010 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:06:39.010 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:06:39.010 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:06:39.010 EAL: Ask a virtual area of 0x61000 bytes 00:06:39.010 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:06:39.010 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:39.010 EAL: Ask a virtual area of 0x400000000 bytes 00:06:39.010 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:06:39.010 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:06:39.010 EAL: Ask a virtual area of 0x61000 bytes 00:06:39.010 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:06:39.010 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:39.010 EAL: Ask a virtual area of 0x400000000 bytes 00:06:39.010 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:06:39.010 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:06:39.010 EAL: Ask a virtual area of 0x61000 bytes 00:06:39.010 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:06:39.010 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:39.010 EAL: Ask a virtual area of 0x400000000 bytes 00:06:39.010 EAL: Virtual area found at 
0x201800e00000 (size = 0x400000000) 00:06:39.010 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:06:39.010 EAL: Ask a virtual area of 0x61000 bytes 00:06:39.010 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:06:39.010 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:39.010 EAL: Ask a virtual area of 0x400000000 bytes 00:06:39.010 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:06:39.010 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:06:39.010 EAL: Hugepages will be freed exactly as allocated. 00:06:39.010 EAL: No shared files mode enabled, IPC is disabled 00:06:39.010 EAL: No shared files mode enabled, IPC is disabled 00:06:39.010 EAL: TSC frequency is ~2500000 KHz 00:06:39.010 EAL: Main lcore 0 is ready (tid=7f49d3dcca00;cpuset=[0]) 00:06:39.010 EAL: Trying to obtain current memory policy. 00:06:39.010 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:39.010 EAL: Restoring previous memory policy: 0 00:06:39.010 EAL: request: mp_malloc_sync 00:06:39.010 EAL: No shared files mode enabled, IPC is disabled 00:06:39.010 EAL: Heap on socket 0 was expanded by 2MB 00:06:39.010 EAL: No shared files mode enabled, IPC is disabled 00:06:39.010 EAL: Mem event callback 'spdk:(nil)' registered 00:06:39.010 00:06:39.010 00:06:39.010 CUnit - A unit testing framework for C - Version 2.1-3 00:06:39.010 http://cunit.sourceforge.net/ 00:06:39.010 00:06:39.010 00:06:39.010 Suite: components_suite 00:06:39.010 Test: vtophys_malloc_test ...passed 00:06:39.010 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:06:39.010 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:39.010 EAL: Restoring previous memory policy: 4 00:06:39.010 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.010 EAL: request: mp_malloc_sync 00:06:39.010 EAL: No shared files mode enabled, IPC is disabled 00:06:39.010 EAL: Heap on socket 0 was expanded by 4MB 00:06:39.010 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.010 EAL: request: mp_malloc_sync 00:06:39.010 EAL: No shared files mode enabled, IPC is disabled 00:06:39.010 EAL: Heap on socket 0 was shrunk by 4MB 00:06:39.010 EAL: Trying to obtain current memory policy. 00:06:39.010 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:39.010 EAL: Restoring previous memory policy: 4 00:06:39.010 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.010 EAL: request: mp_malloc_sync 00:06:39.010 EAL: No shared files mode enabled, IPC is disabled 00:06:39.010 EAL: Heap on socket 0 was expanded by 6MB 00:06:39.010 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.010 EAL: request: mp_malloc_sync 00:06:39.010 EAL: No shared files mode enabled, IPC is disabled 00:06:39.010 EAL: Heap on socket 0 was shrunk by 6MB 00:06:39.010 EAL: Trying to obtain current memory policy. 00:06:39.010 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:39.010 EAL: Restoring previous memory policy: 4 00:06:39.010 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.010 EAL: request: mp_malloc_sync 00:06:39.010 EAL: No shared files mode enabled, IPC is disabled 00:06:39.010 EAL: Heap on socket 0 was expanded by 10MB 00:06:39.010 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.010 EAL: request: mp_malloc_sync 00:06:39.010 EAL: No shared files mode enabled, IPC is disabled 00:06:39.010 EAL: Heap on socket 0 was shrunk by 10MB 00:06:39.010 EAL: Trying to obtain current memory policy. 
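Each expanded-by/shrunk-by pair above is the vtophys test mapping and releasing 2 MB hugepages through the registered 'spdk:(nil)' mem event callback. Hugepage consumption during such a run can be watched from a second shell; a sketch using the standard procfs counters:

  # Watch 2MB hugepage usage while the test grows the heap from
  # 4MB up through 1026MB and shrinks it back down.
  watch -n1 'grep -E "HugePages_(Total|Free|Rsvd)" /proc/meminfo'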
00:06:39.010 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:39.010 EAL: Restoring previous memory policy: 4 00:06:39.010 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.010 EAL: request: mp_malloc_sync 00:06:39.010 EAL: No shared files mode enabled, IPC is disabled 00:06:39.010 EAL: Heap on socket 0 was expanded by 18MB 00:06:39.010 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.010 EAL: request: mp_malloc_sync 00:06:39.010 EAL: No shared files mode enabled, IPC is disabled 00:06:39.010 EAL: Heap on socket 0 was shrunk by 18MB 00:06:39.010 EAL: Trying to obtain current memory policy. 00:06:39.010 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:39.010 EAL: Restoring previous memory policy: 4 00:06:39.010 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.010 EAL: request: mp_malloc_sync 00:06:39.010 EAL: No shared files mode enabled, IPC is disabled 00:06:39.010 EAL: Heap on socket 0 was expanded by 34MB 00:06:39.270 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.270 EAL: request: mp_malloc_sync 00:06:39.270 EAL: No shared files mode enabled, IPC is disabled 00:06:39.270 EAL: Heap on socket 0 was shrunk by 34MB 00:06:39.270 EAL: Trying to obtain current memory policy. 00:06:39.270 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:39.270 EAL: Restoring previous memory policy: 4 00:06:39.270 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.270 EAL: request: mp_malloc_sync 00:06:39.271 EAL: No shared files mode enabled, IPC is disabled 00:06:39.271 EAL: Heap on socket 0 was expanded by 66MB 00:06:39.271 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.271 EAL: request: mp_malloc_sync 00:06:39.271 EAL: No shared files mode enabled, IPC is disabled 00:06:39.271 EAL: Heap on socket 0 was shrunk by 66MB 00:06:39.271 EAL: Trying to obtain current memory policy. 00:06:39.271 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:39.271 EAL: Restoring previous memory policy: 4 00:06:39.271 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.271 EAL: request: mp_malloc_sync 00:06:39.271 EAL: No shared files mode enabled, IPC is disabled 00:06:39.271 EAL: Heap on socket 0 was expanded by 130MB 00:06:39.271 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.271 EAL: request: mp_malloc_sync 00:06:39.271 EAL: No shared files mode enabled, IPC is disabled 00:06:39.271 EAL: Heap on socket 0 was shrunk by 130MB 00:06:39.271 EAL: Trying to obtain current memory policy. 00:06:39.271 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:39.271 EAL: Restoring previous memory policy: 4 00:06:39.271 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.271 EAL: request: mp_malloc_sync 00:06:39.271 EAL: No shared files mode enabled, IPC is disabled 00:06:39.271 EAL: Heap on socket 0 was expanded by 258MB 00:06:39.271 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.271 EAL: request: mp_malloc_sync 00:06:39.271 EAL: No shared files mode enabled, IPC is disabled 00:06:39.271 EAL: Heap on socket 0 was shrunk by 258MB 00:06:39.271 EAL: Trying to obtain current memory policy. 
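The obtain/set/restore cycle around every allocation above applies MPOL_PREFERRED for socket 0 only for the duration of the malloc; the same preference can be applied to a whole process with the standard numactl tool (a sketch, target binary arbitrary):

  # Inspect the NUMA layout, then run a binary with allocations
  # preferred on node 0, analogous to EAL's per-malloc policy above.
  numactl --hardware
  numactl --preferred=0 ./build/bin/spdk_tgt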
00:06:39.271 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:39.531 EAL: Restoring previous memory policy: 4 00:06:39.531 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.531 EAL: request: mp_malloc_sync 00:06:39.531 EAL: No shared files mode enabled, IPC is disabled 00:06:39.531 EAL: Heap on socket 0 was expanded by 514MB 00:06:39.531 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.531 EAL: request: mp_malloc_sync 00:06:39.531 EAL: No shared files mode enabled, IPC is disabled 00:06:39.531 EAL: Heap on socket 0 was shrunk by 514MB 00:06:39.531 EAL: Trying to obtain current memory policy. 00:06:39.531 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:39.791 EAL: Restoring previous memory policy: 4 00:06:39.791 EAL: Calling mem event callback 'spdk:(nil)' 00:06:39.791 EAL: request: mp_malloc_sync 00:06:39.791 EAL: No shared files mode enabled, IPC is disabled 00:06:39.791 EAL: Heap on socket 0 was expanded by 1026MB 00:06:40.051 EAL: Calling mem event callback 'spdk:(nil)' 00:06:40.051 EAL: request: mp_malloc_sync 00:06:40.051 EAL: No shared files mode enabled, IPC is disabled 00:06:40.051 EAL: Heap on socket 0 was shrunk by 1026MB 00:06:40.051 passed 00:06:40.051 00:06:40.051 Run Summary: Type Total Ran Passed Failed Inactive 00:06:40.051 suites 1 1 n/a 0 0 00:06:40.051 tests 2 2 2 0 0 00:06:40.051 asserts 497 497 497 0 n/a 00:06:40.051 00:06:40.051 Elapsed time = 0.975 seconds 00:06:40.051 EAL: Calling mem event callback 'spdk:(nil)' 00:06:40.051 EAL: request: mp_malloc_sync 00:06:40.051 EAL: No shared files mode enabled, IPC is disabled 00:06:40.051 EAL: Heap on socket 0 was shrunk by 2MB 00:06:40.051 EAL: No shared files mode enabled, IPC is disabled 00:06:40.051 EAL: No shared files mode enabled, IPC is disabled 00:06:40.051 EAL: No shared files mode enabled, IPC is disabled 00:06:40.051 00:06:40.051 real 0m1.112s 00:06:40.051 user 0m0.630s 00:06:40.051 sys 0m0.462s 00:06:40.051 14:19:36 env.env_vtophys -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:40.051 14:19:36 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:06:40.051 ************************************ 00:06:40.051 END TEST env_vtophys 00:06:40.051 ************************************ 00:06:40.051 14:19:36 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:06:40.051 14:19:36 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:40.051 14:19:36 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:40.051 14:19:36 env -- common/autotest_common.sh@10 -- # set +x 00:06:40.312 ************************************ 00:06:40.312 START TEST env_pci 00:06:40.312 ************************************ 00:06:40.312 14:19:36 env.env_pci -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:06:40.312 00:06:40.312 00:06:40.312 CUnit - A unit testing framework for C - Version 2.1-3 00:06:40.312 http://cunit.sourceforge.net/ 00:06:40.312 00:06:40.312 00:06:40.312 Suite: pci 00:06:40.312 Test: pci_hook ...[2024-11-18 14:19:36.216130] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1118:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 307916 has claimed it 00:06:40.312 EAL: Cannot find device (10000:00:01.0) 00:06:40.312 EAL: Failed to attach device on primary process 00:06:40.312 passed 00:06:40.312 00:06:40.312 Run Summary: Type Total Ran Passed Failed Inactive 
00:06:40.312 suites 1 1 n/a 0 0 00:06:40.312 tests 1 1 1 0 0 00:06:40.312 asserts 25 25 25 0 n/a 00:06:40.312 00:06:40.312 Elapsed time = 0.037 seconds 00:06:40.312 00:06:40.312 real 0m0.057s 00:06:40.312 user 0m0.010s 00:06:40.312 sys 0m0.046s 00:06:40.312 14:19:36 env.env_pci -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:40.312 14:19:36 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:06:40.312 ************************************ 00:06:40.312 END TEST env_pci 00:06:40.312 ************************************ 00:06:40.312 14:19:36 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:06:40.312 14:19:36 env -- env/env.sh@15 -- # uname 00:06:40.312 14:19:36 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:06:40.312 14:19:36 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:06:40.312 14:19:36 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:40.312 14:19:36 env -- common/autotest_common.sh@1105 -- # '[' 5 -le 1 ']' 00:06:40.312 14:19:36 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:40.312 14:19:36 env -- common/autotest_common.sh@10 -- # set +x 00:06:40.312 ************************************ 00:06:40.312 START TEST env_dpdk_post_init 00:06:40.312 ************************************ 00:06:40.312 14:19:36 env.env_dpdk_post_init -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:40.312 EAL: Detected CPU lcores: 112 00:06:40.312 EAL: Detected NUMA nodes: 2 00:06:40.312 EAL: Detected static linkage of DPDK 00:06:40.312 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:40.312 EAL: Selected IOVA mode 'VA' 00:06:40.312 EAL: VFIO support initialized 00:06:40.312 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:40.572 EAL: Using IOMMU type 1 (Type 1) 00:06:41.144 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:06:45.339 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:06:45.339 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:06:45.339 Starting DPDK initialization... 00:06:45.339 Starting SPDK post initialization... 00:06:45.339 SPDK NVMe probe 00:06:45.339 Attaching to 0000:d8:00.0 00:06:45.339 Attached to 0000:d8:00.0 00:06:45.339 Cleaning up... 
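env_dpdk_post_init only attaches to 0000:d8:00.0 because setup.sh rebound it from nvme to vfio-pci earlier in this log; the current binding of a BDF can be confirmed directly from sysfs (sketch, BDF taken from this run):

  # Show which kernel driver currently claims the NVMe controller.
  bdf=0000:d8:00.0
  basename "$(readlink -f /sys/bus/pci/devices/$bdf/driver)"   # expect vfio-pci
  lspci -s "$bdf" -nn                                          # shows [8086:0a54]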
00:06:45.339 00:06:45.339 real 0m4.690s 00:06:45.339 user 0m3.518s 00:06:45.339 sys 0m0.415s 00:06:45.339 14:19:41 env.env_dpdk_post_init -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:45.339 14:19:41 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:06:45.339 ************************************ 00:06:45.339 END TEST env_dpdk_post_init 00:06:45.339 ************************************ 00:06:45.339 14:19:41 env -- env/env.sh@26 -- # uname 00:06:45.339 14:19:41 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:06:45.339 14:19:41 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:45.339 14:19:41 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:45.339 14:19:41 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:45.339 14:19:41 env -- common/autotest_common.sh@10 -- # set +x 00:06:45.339 ************************************ 00:06:45.339 START TEST env_mem_callbacks 00:06:45.339 ************************************ 00:06:45.339 14:19:41 env.env_mem_callbacks -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:45.339 EAL: Detected CPU lcores: 112 00:06:45.339 EAL: Detected NUMA nodes: 2 00:06:45.339 EAL: Detected static linkage of DPDK 00:06:45.339 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:45.339 EAL: Selected IOVA mode 'VA' 00:06:45.339 EAL: VFIO support initialized 00:06:45.339 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:45.339 00:06:45.339 00:06:45.339 CUnit - A unit testing framework for C - Version 2.1-3 00:06:45.339 http://cunit.sourceforge.net/ 00:06:45.339 00:06:45.339 00:06:45.339 Suite: memory 00:06:45.339 Test: test ... 
00:06:45.339 register 0x200000200000 2097152 00:06:45.339 malloc 3145728 00:06:45.339 register 0x200000400000 4194304 00:06:45.339 buf 0x200000500000 len 3145728 PASSED 00:06:45.339 malloc 64 00:06:45.339 buf 0x2000004fff40 len 64 PASSED 00:06:45.339 malloc 4194304 00:06:45.339 register 0x200000800000 6291456 00:06:45.339 buf 0x200000a00000 len 4194304 PASSED 00:06:45.339 free 0x200000500000 3145728 00:06:45.339 free 0x2000004fff40 64 00:06:45.339 unregister 0x200000400000 4194304 PASSED 00:06:45.339 free 0x200000a00000 4194304 00:06:45.339 unregister 0x200000800000 6291456 PASSED 00:06:45.339 malloc 8388608 00:06:45.339 register 0x200000400000 10485760 00:06:45.339 buf 0x200000600000 len 8388608 PASSED 00:06:45.339 free 0x200000600000 8388608 00:06:45.339 unregister 0x200000400000 10485760 PASSED 00:06:45.339 passed 00:06:45.339 00:06:45.339 Run Summary: Type Total Ran Passed Failed Inactive 00:06:45.339 suites 1 1 n/a 0 0 00:06:45.339 tests 1 1 1 0 0 00:06:45.339 asserts 15 15 15 0 n/a 00:06:45.339 00:06:45.339 Elapsed time = 0.008 seconds 00:06:45.339 00:06:45.339 real 0m0.069s 00:06:45.339 user 0m0.019s 00:06:45.339 sys 0m0.050s 00:06:45.339 14:19:41 env.env_mem_callbacks -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:45.339 14:19:41 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:06:45.339 ************************************ 00:06:45.340 END TEST env_mem_callbacks 00:06:45.340 ************************************ 00:06:45.340 00:06:45.340 real 0m6.658s 00:06:45.340 user 0m4.519s 00:06:45.340 sys 0m1.410s 00:06:45.340 14:19:41 env -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:45.340 14:19:41 env -- common/autotest_common.sh@10 -- # set +x 00:06:45.340 ************************************ 00:06:45.340 END TEST env 00:06:45.340 ************************************ 00:06:45.340 14:19:41 -- spdk/autotest.sh@156 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:06:45.340 14:19:41 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:45.340 14:19:41 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:45.340 14:19:41 -- common/autotest_common.sh@10 -- # set +x 00:06:45.340 ************************************ 00:06:45.340 START TEST rpc 00:06:45.340 ************************************ 00:06:45.340 14:19:41 rpc -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:06:45.340 * Looking for test storage... 
00:06:45.340 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:45.340 14:19:41 rpc -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:45.340 14:19:41 rpc -- common/autotest_common.sh@1693 -- # lcov --version 00:06:45.340 14:19:41 rpc -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:45.601 14:19:41 rpc -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:45.601 14:19:41 rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:45.601 14:19:41 rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:45.601 14:19:41 rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:45.601 14:19:41 rpc -- scripts/common.sh@336 -- # IFS=.-: 00:06:45.601 14:19:41 rpc -- scripts/common.sh@336 -- # read -ra ver1 00:06:45.601 14:19:41 rpc -- scripts/common.sh@337 -- # IFS=.-: 00:06:45.601 14:19:41 rpc -- scripts/common.sh@337 -- # read -ra ver2 00:06:45.601 14:19:41 rpc -- scripts/common.sh@338 -- # local 'op=<' 00:06:45.601 14:19:41 rpc -- scripts/common.sh@340 -- # ver1_l=2 00:06:45.601 14:19:41 rpc -- scripts/common.sh@341 -- # ver2_l=1 00:06:45.601 14:19:41 rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:45.601 14:19:41 rpc -- scripts/common.sh@344 -- # case "$op" in 00:06:45.601 14:19:41 rpc -- scripts/common.sh@345 -- # : 1 00:06:45.601 14:19:41 rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:45.601 14:19:41 rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:45.601 14:19:41 rpc -- scripts/common.sh@365 -- # decimal 1 00:06:45.601 14:19:41 rpc -- scripts/common.sh@353 -- # local d=1 00:06:45.601 14:19:41 rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:45.601 14:19:41 rpc -- scripts/common.sh@355 -- # echo 1 00:06:45.601 14:19:41 rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:06:45.601 14:19:41 rpc -- scripts/common.sh@366 -- # decimal 2 00:06:45.601 14:19:41 rpc -- scripts/common.sh@353 -- # local d=2 00:06:45.601 14:19:41 rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:45.601 14:19:41 rpc -- scripts/common.sh@355 -- # echo 2 00:06:45.601 14:19:41 rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:06:45.601 14:19:41 rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:45.601 14:19:41 rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:45.601 14:19:41 rpc -- scripts/common.sh@368 -- # return 0 00:06:45.601 14:19:41 rpc -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:45.601 14:19:41 rpc -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:45.601 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:45.601 --rc genhtml_branch_coverage=1 00:06:45.601 --rc genhtml_function_coverage=1 00:06:45.601 --rc genhtml_legend=1 00:06:45.601 --rc geninfo_all_blocks=1 00:06:45.601 --rc geninfo_unexecuted_blocks=1 00:06:45.601 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:45.601 ' 00:06:45.601 14:19:41 rpc -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:45.601 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:45.601 --rc genhtml_branch_coverage=1 00:06:45.601 --rc genhtml_function_coverage=1 00:06:45.601 --rc genhtml_legend=1 00:06:45.601 --rc geninfo_all_blocks=1 00:06:45.601 --rc geninfo_unexecuted_blocks=1 00:06:45.601 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:45.601 ' 00:06:45.601 14:19:41 rpc -- common/autotest_common.sh@1707 -- # 
export 'LCOV=lcov 00:06:45.601 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:45.601 --rc genhtml_branch_coverage=1 00:06:45.601 --rc genhtml_function_coverage=1 00:06:45.601 --rc genhtml_legend=1 00:06:45.601 --rc geninfo_all_blocks=1 00:06:45.601 --rc geninfo_unexecuted_blocks=1 00:06:45.601 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:45.601 ' 00:06:45.601 14:19:41 rpc -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:45.601 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:45.601 --rc genhtml_branch_coverage=1 00:06:45.601 --rc genhtml_function_coverage=1 00:06:45.601 --rc genhtml_legend=1 00:06:45.601 --rc geninfo_all_blocks=1 00:06:45.601 --rc geninfo_unexecuted_blocks=1 00:06:45.601 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:45.601 ' 00:06:45.601 14:19:41 rpc -- rpc/rpc.sh@65 -- # spdk_pid=309087 00:06:45.601 14:19:41 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:06:45.601 14:19:41 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:45.601 14:19:41 rpc -- rpc/rpc.sh@67 -- # waitforlisten 309087 00:06:45.601 14:19:41 rpc -- common/autotest_common.sh@835 -- # '[' -z 309087 ']' 00:06:45.601 14:19:41 rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:45.601 14:19:41 rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:45.601 14:19:41 rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:45.601 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:45.601 14:19:41 rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:45.601 14:19:41 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:45.601 [2024-11-18 14:19:41.553274] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:06:45.601 [2024-11-18 14:19:41.553363] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid309087 ] 00:06:45.601 [2024-11-18 14:19:41.621079] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:45.601 [2024-11-18 14:19:41.641968] app.c: 612:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:06:45.601 [2024-11-18 14:19:41.642007] app.c: 616:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 309087' to capture a snapshot of events at runtime. 00:06:45.601 [2024-11-18 14:19:41.642016] app.c: 618:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:06:45.601 [2024-11-18 14:19:41.642024] app.c: 619:app_setup_trace: *NOTICE*: SPDK application currently running. 00:06:45.601 [2024-11-18 14:19:41.642031] app.c: 620:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid309087 for offline analysis/debug. 
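The app_setup_trace notices above come from starting spdk_tgt with -e bdev, which enables the bdev tpoint group (mask 0x8, visible in the trace_get_info output further down). The snapshot workflow the notice suggests, using the pid from this run:

  # Capture a live snapshot of the enabled bdev tracepoints.
  ./build/bin/spdk_trace -s spdk_tgt -p 309087
  # Keep the shared-memory trace file for offline analysis, as the
  # notice above recommends.
  cp /dev/shm/spdk_tgt_trace.pid309087 /tmp/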
00:06:45.601 [2024-11-18 14:19:41.642632] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.862 14:19:41 rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:45.862 14:19:41 rpc -- common/autotest_common.sh@868 -- # return 0 00:06:45.862 14:19:41 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:45.862 14:19:41 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:45.862 14:19:41 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:06:45.862 14:19:41 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:06:45.862 14:19:41 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:45.862 14:19:41 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:45.862 14:19:41 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:45.862 ************************************ 00:06:45.862 START TEST rpc_integrity 00:06:45.862 ************************************ 00:06:45.862 14:19:41 rpc.rpc_integrity -- common/autotest_common.sh@1129 -- # rpc_integrity 00:06:45.862 14:19:41 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:45.862 14:19:41 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:45.862 14:19:41 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:45.862 14:19:41 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:45.862 14:19:41 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:45.862 14:19:41 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:45.862 14:19:41 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:45.862 14:19:41 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:45.862 14:19:41 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:45.862 14:19:41 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:45.862 14:19:41 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:45.862 14:19:41 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:06:45.862 14:19:41 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:45.862 14:19:41 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:45.862 14:19:41 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:45.862 14:19:41 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:45.862 14:19:41 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:45.862 { 00:06:45.862 "name": "Malloc0", 00:06:45.862 "aliases": [ 00:06:45.862 "88a5fac7-bff1-4c13-9d89-c137e6b2cc99" 00:06:45.862 ], 00:06:45.862 "product_name": "Malloc disk", 00:06:45.862 "block_size": 512, 00:06:45.862 "num_blocks": 16384, 00:06:45.862 "uuid": "88a5fac7-bff1-4c13-9d89-c137e6b2cc99", 00:06:45.862 "assigned_rate_limits": { 00:06:45.862 "rw_ios_per_sec": 0, 00:06:45.862 "rw_mbytes_per_sec": 0, 00:06:45.862 "r_mbytes_per_sec": 0, 00:06:45.862 "w_mbytes_per_sec": 
0 00:06:45.862 }, 00:06:45.862 "claimed": false, 00:06:45.862 "zoned": false, 00:06:45.862 "supported_io_types": { 00:06:45.862 "read": true, 00:06:45.862 "write": true, 00:06:45.862 "unmap": true, 00:06:45.862 "flush": true, 00:06:45.862 "reset": true, 00:06:45.862 "nvme_admin": false, 00:06:45.862 "nvme_io": false, 00:06:45.862 "nvme_io_md": false, 00:06:45.862 "write_zeroes": true, 00:06:45.862 "zcopy": true, 00:06:45.862 "get_zone_info": false, 00:06:45.862 "zone_management": false, 00:06:45.862 "zone_append": false, 00:06:45.862 "compare": false, 00:06:45.862 "compare_and_write": false, 00:06:45.862 "abort": true, 00:06:45.862 "seek_hole": false, 00:06:45.862 "seek_data": false, 00:06:45.862 "copy": true, 00:06:45.862 "nvme_iov_md": false 00:06:45.862 }, 00:06:45.862 "memory_domains": [ 00:06:45.862 { 00:06:45.862 "dma_device_id": "system", 00:06:45.862 "dma_device_type": 1 00:06:45.862 }, 00:06:45.862 { 00:06:45.862 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:45.862 "dma_device_type": 2 00:06:45.862 } 00:06:45.862 ], 00:06:45.862 "driver_specific": {} 00:06:45.862 } 00:06:45.862 ]' 00:06:45.862 14:19:41 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:46.122 14:19:42 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:46.122 14:19:42 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:06:46.122 14:19:42 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:46.122 14:19:42 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:46.122 [2024-11-18 14:19:42.031511] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:06:46.122 [2024-11-18 14:19:42.031544] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:46.122 [2024-11-18 14:19:42.031565] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x59392e0 00:06:46.122 [2024-11-18 14:19:42.031575] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:46.122 [2024-11-18 14:19:42.032384] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:46.122 [2024-11-18 14:19:42.032408] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:46.122 Passthru0 00:06:46.123 14:19:42 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:46.123 14:19:42 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:46.123 14:19:42 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:46.123 14:19:42 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:46.123 14:19:42 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:46.123 14:19:42 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:46.123 { 00:06:46.123 "name": "Malloc0", 00:06:46.123 "aliases": [ 00:06:46.123 "88a5fac7-bff1-4c13-9d89-c137e6b2cc99" 00:06:46.123 ], 00:06:46.123 "product_name": "Malloc disk", 00:06:46.123 "block_size": 512, 00:06:46.123 "num_blocks": 16384, 00:06:46.123 "uuid": "88a5fac7-bff1-4c13-9d89-c137e6b2cc99", 00:06:46.123 "assigned_rate_limits": { 00:06:46.123 "rw_ios_per_sec": 0, 00:06:46.123 "rw_mbytes_per_sec": 0, 00:06:46.123 "r_mbytes_per_sec": 0, 00:06:46.123 "w_mbytes_per_sec": 0 00:06:46.123 }, 00:06:46.123 "claimed": true, 00:06:46.123 "claim_type": "exclusive_write", 00:06:46.123 "zoned": false, 00:06:46.123 "supported_io_types": { 00:06:46.123 "read": true, 00:06:46.123 "write": true, 00:06:46.123 "unmap": true, 
00:06:46.123 "flush": true, 00:06:46.123 "reset": true, 00:06:46.123 "nvme_admin": false, 00:06:46.123 "nvme_io": false, 00:06:46.123 "nvme_io_md": false, 00:06:46.123 "write_zeroes": true, 00:06:46.123 "zcopy": true, 00:06:46.123 "get_zone_info": false, 00:06:46.123 "zone_management": false, 00:06:46.123 "zone_append": false, 00:06:46.123 "compare": false, 00:06:46.123 "compare_and_write": false, 00:06:46.123 "abort": true, 00:06:46.123 "seek_hole": false, 00:06:46.123 "seek_data": false, 00:06:46.123 "copy": true, 00:06:46.123 "nvme_iov_md": false 00:06:46.123 }, 00:06:46.123 "memory_domains": [ 00:06:46.123 { 00:06:46.123 "dma_device_id": "system", 00:06:46.123 "dma_device_type": 1 00:06:46.123 }, 00:06:46.123 { 00:06:46.123 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:46.123 "dma_device_type": 2 00:06:46.123 } 00:06:46.123 ], 00:06:46.123 "driver_specific": {} 00:06:46.123 }, 00:06:46.123 { 00:06:46.123 "name": "Passthru0", 00:06:46.123 "aliases": [ 00:06:46.123 "8b522b1f-f201-53ba-8feb-c8650f9afbac" 00:06:46.123 ], 00:06:46.123 "product_name": "passthru", 00:06:46.123 "block_size": 512, 00:06:46.123 "num_blocks": 16384, 00:06:46.123 "uuid": "8b522b1f-f201-53ba-8feb-c8650f9afbac", 00:06:46.123 "assigned_rate_limits": { 00:06:46.123 "rw_ios_per_sec": 0, 00:06:46.123 "rw_mbytes_per_sec": 0, 00:06:46.123 "r_mbytes_per_sec": 0, 00:06:46.123 "w_mbytes_per_sec": 0 00:06:46.123 }, 00:06:46.123 "claimed": false, 00:06:46.123 "zoned": false, 00:06:46.123 "supported_io_types": { 00:06:46.123 "read": true, 00:06:46.123 "write": true, 00:06:46.123 "unmap": true, 00:06:46.123 "flush": true, 00:06:46.123 "reset": true, 00:06:46.123 "nvme_admin": false, 00:06:46.123 "nvme_io": false, 00:06:46.123 "nvme_io_md": false, 00:06:46.123 "write_zeroes": true, 00:06:46.123 "zcopy": true, 00:06:46.123 "get_zone_info": false, 00:06:46.123 "zone_management": false, 00:06:46.123 "zone_append": false, 00:06:46.123 "compare": false, 00:06:46.123 "compare_and_write": false, 00:06:46.123 "abort": true, 00:06:46.123 "seek_hole": false, 00:06:46.123 "seek_data": false, 00:06:46.123 "copy": true, 00:06:46.123 "nvme_iov_md": false 00:06:46.123 }, 00:06:46.123 "memory_domains": [ 00:06:46.123 { 00:06:46.123 "dma_device_id": "system", 00:06:46.123 "dma_device_type": 1 00:06:46.123 }, 00:06:46.123 { 00:06:46.123 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:46.123 "dma_device_type": 2 00:06:46.123 } 00:06:46.123 ], 00:06:46.123 "driver_specific": { 00:06:46.123 "passthru": { 00:06:46.123 "name": "Passthru0", 00:06:46.123 "base_bdev_name": "Malloc0" 00:06:46.123 } 00:06:46.123 } 00:06:46.123 } 00:06:46.123 ]' 00:06:46.123 14:19:42 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:46.123 14:19:42 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:46.123 14:19:42 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:46.123 14:19:42 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:46.123 14:19:42 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:46.123 14:19:42 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:46.123 14:19:42 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:06:46.123 14:19:42 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:46.123 14:19:42 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:46.123 14:19:42 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:46.123 14:19:42 rpc.rpc_integrity -- 
rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:46.123 14:19:42 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:46.123 14:19:42 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:46.123 14:19:42 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:46.123 14:19:42 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:46.123 14:19:42 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:46.123 14:19:42 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:46.123 00:06:46.123 real 0m0.287s 00:06:46.123 user 0m0.176s 00:06:46.123 sys 0m0.049s 00:06:46.123 14:19:42 rpc.rpc_integrity -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:46.123 14:19:42 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:46.123 ************************************ 00:06:46.123 END TEST rpc_integrity 00:06:46.123 ************************************ 00:06:46.123 14:19:42 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:06:46.123 14:19:42 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:46.123 14:19:42 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:46.123 14:19:42 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:46.384 ************************************ 00:06:46.384 START TEST rpc_plugins 00:06:46.384 ************************************ 00:06:46.384 14:19:42 rpc.rpc_plugins -- common/autotest_common.sh@1129 -- # rpc_plugins 00:06:46.384 14:19:42 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:06:46.384 14:19:42 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:46.384 14:19:42 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:46.384 14:19:42 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:46.384 14:19:42 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:06:46.384 14:19:42 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:06:46.384 14:19:42 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:46.384 14:19:42 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:46.384 14:19:42 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:46.384 14:19:42 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:06:46.384 { 00:06:46.384 "name": "Malloc1", 00:06:46.384 "aliases": [ 00:06:46.384 "c36d0e1f-88f7-41d1-a354-b134e4d92d3b" 00:06:46.384 ], 00:06:46.384 "product_name": "Malloc disk", 00:06:46.384 "block_size": 4096, 00:06:46.384 "num_blocks": 256, 00:06:46.384 "uuid": "c36d0e1f-88f7-41d1-a354-b134e4d92d3b", 00:06:46.384 "assigned_rate_limits": { 00:06:46.384 "rw_ios_per_sec": 0, 00:06:46.384 "rw_mbytes_per_sec": 0, 00:06:46.384 "r_mbytes_per_sec": 0, 00:06:46.384 "w_mbytes_per_sec": 0 00:06:46.384 }, 00:06:46.384 "claimed": false, 00:06:46.384 "zoned": false, 00:06:46.384 "supported_io_types": { 00:06:46.384 "read": true, 00:06:46.384 "write": true, 00:06:46.384 "unmap": true, 00:06:46.384 "flush": true, 00:06:46.384 "reset": true, 00:06:46.384 "nvme_admin": false, 00:06:46.384 "nvme_io": false, 00:06:46.384 "nvme_io_md": false, 00:06:46.384 "write_zeroes": true, 00:06:46.384 "zcopy": true, 00:06:46.384 "get_zone_info": false, 00:06:46.384 "zone_management": false, 00:06:46.384 "zone_append": false, 00:06:46.384 "compare": false, 00:06:46.384 "compare_and_write": false, 00:06:46.384 "abort": true, 00:06:46.384 "seek_hole": false, 00:06:46.384 "seek_data": false, 00:06:46.384 "copy": true, 00:06:46.384 
"nvme_iov_md": false 00:06:46.384 }, 00:06:46.384 "memory_domains": [ 00:06:46.384 { 00:06:46.384 "dma_device_id": "system", 00:06:46.384 "dma_device_type": 1 00:06:46.384 }, 00:06:46.384 { 00:06:46.384 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:46.384 "dma_device_type": 2 00:06:46.384 } 00:06:46.384 ], 00:06:46.384 "driver_specific": {} 00:06:46.384 } 00:06:46.384 ]' 00:06:46.384 14:19:42 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:06:46.384 14:19:42 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:06:46.384 14:19:42 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:06:46.384 14:19:42 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:46.384 14:19:42 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:46.384 14:19:42 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:46.384 14:19:42 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:06:46.384 14:19:42 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:46.384 14:19:42 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:46.384 14:19:42 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:46.384 14:19:42 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:06:46.384 14:19:42 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:06:46.384 14:19:42 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:06:46.384 00:06:46.384 real 0m0.150s 00:06:46.384 user 0m0.090s 00:06:46.384 sys 0m0.026s 00:06:46.384 14:19:42 rpc.rpc_plugins -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:46.384 14:19:42 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:46.384 ************************************ 00:06:46.384 END TEST rpc_plugins 00:06:46.384 ************************************ 00:06:46.384 14:19:42 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:06:46.384 14:19:42 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:46.384 14:19:42 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:46.384 14:19:42 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:46.384 ************************************ 00:06:46.384 START TEST rpc_trace_cmd_test 00:06:46.384 ************************************ 00:06:46.384 14:19:42 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1129 -- # rpc_trace_cmd_test 00:06:46.384 14:19:42 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:06:46.384 14:19:42 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:06:46.384 14:19:42 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:46.384 14:19:42 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:46.384 14:19:42 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:46.384 14:19:42 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:06:46.384 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid309087", 00:06:46.384 "tpoint_group_mask": "0x8", 00:06:46.384 "iscsi_conn": { 00:06:46.384 "mask": "0x2", 00:06:46.384 "tpoint_mask": "0x0" 00:06:46.384 }, 00:06:46.384 "scsi": { 00:06:46.384 "mask": "0x4", 00:06:46.384 "tpoint_mask": "0x0" 00:06:46.384 }, 00:06:46.384 "bdev": { 00:06:46.384 "mask": "0x8", 00:06:46.384 "tpoint_mask": "0xffffffffffffffff" 00:06:46.384 }, 00:06:46.384 "nvmf_rdma": { 00:06:46.384 "mask": "0x10", 00:06:46.384 "tpoint_mask": "0x0" 00:06:46.384 }, 00:06:46.384 "nvmf_tcp": { 00:06:46.384 "mask": "0x20", 
00:06:46.384 "tpoint_mask": "0x0" 00:06:46.384 }, 00:06:46.384 "ftl": { 00:06:46.384 "mask": "0x40", 00:06:46.384 "tpoint_mask": "0x0" 00:06:46.384 }, 00:06:46.384 "blobfs": { 00:06:46.384 "mask": "0x80", 00:06:46.384 "tpoint_mask": "0x0" 00:06:46.384 }, 00:06:46.384 "dsa": { 00:06:46.384 "mask": "0x200", 00:06:46.384 "tpoint_mask": "0x0" 00:06:46.384 }, 00:06:46.384 "thread": { 00:06:46.384 "mask": "0x400", 00:06:46.384 "tpoint_mask": "0x0" 00:06:46.384 }, 00:06:46.384 "nvme_pcie": { 00:06:46.385 "mask": "0x800", 00:06:46.385 "tpoint_mask": "0x0" 00:06:46.385 }, 00:06:46.385 "iaa": { 00:06:46.385 "mask": "0x1000", 00:06:46.385 "tpoint_mask": "0x0" 00:06:46.385 }, 00:06:46.385 "nvme_tcp": { 00:06:46.385 "mask": "0x2000", 00:06:46.385 "tpoint_mask": "0x0" 00:06:46.385 }, 00:06:46.385 "bdev_nvme": { 00:06:46.385 "mask": "0x4000", 00:06:46.385 "tpoint_mask": "0x0" 00:06:46.385 }, 00:06:46.385 "sock": { 00:06:46.385 "mask": "0x8000", 00:06:46.385 "tpoint_mask": "0x0" 00:06:46.385 }, 00:06:46.385 "blob": { 00:06:46.385 "mask": "0x10000", 00:06:46.385 "tpoint_mask": "0x0" 00:06:46.385 }, 00:06:46.385 "bdev_raid": { 00:06:46.385 "mask": "0x20000", 00:06:46.385 "tpoint_mask": "0x0" 00:06:46.385 }, 00:06:46.385 "scheduler": { 00:06:46.385 "mask": "0x40000", 00:06:46.385 "tpoint_mask": "0x0" 00:06:46.385 } 00:06:46.385 }' 00:06:46.644 14:19:42 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:06:46.644 14:19:42 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 19 -gt 2 ']' 00:06:46.644 14:19:42 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:06:46.644 14:19:42 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:06:46.644 14:19:42 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:06:46.644 14:19:42 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:06:46.644 14:19:42 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:06:46.644 14:19:42 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:06:46.644 14:19:42 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:06:46.645 14:19:42 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:06:46.645 00:06:46.645 real 0m0.244s 00:06:46.645 user 0m0.198s 00:06:46.645 sys 0m0.039s 00:06:46.645 14:19:42 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:46.645 14:19:42 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:46.645 ************************************ 00:06:46.645 END TEST rpc_trace_cmd_test 00:06:46.645 ************************************ 00:06:46.645 14:19:42 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:06:46.645 14:19:42 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:06:46.645 14:19:42 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:06:46.645 14:19:42 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:46.645 14:19:42 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:46.645 14:19:42 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:46.905 ************************************ 00:06:46.905 START TEST rpc_daemon_integrity 00:06:46.905 ************************************ 00:06:46.905 14:19:42 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1129 -- # rpc_integrity 00:06:46.905 14:19:42 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:46.905 14:19:42 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:46.905 14:19:42 
rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:46.905 14:19:42 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:46.905 14:19:42 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:46.905 14:19:42 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:46.905 14:19:42 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:46.905 14:19:42 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:46.905 14:19:42 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:46.905 14:19:42 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:46.905 14:19:42 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:46.905 14:19:42 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:06:46.905 14:19:42 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:46.905 14:19:42 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:46.905 14:19:42 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:46.905 14:19:42 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:46.905 14:19:42 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:46.905 { 00:06:46.905 "name": "Malloc2", 00:06:46.905 "aliases": [ 00:06:46.905 "33cea8e8-e3db-4ad8-a673-9759e5e36462" 00:06:46.905 ], 00:06:46.905 "product_name": "Malloc disk", 00:06:46.905 "block_size": 512, 00:06:46.905 "num_blocks": 16384, 00:06:46.905 "uuid": "33cea8e8-e3db-4ad8-a673-9759e5e36462", 00:06:46.905 "assigned_rate_limits": { 00:06:46.905 "rw_ios_per_sec": 0, 00:06:46.905 "rw_mbytes_per_sec": 0, 00:06:46.905 "r_mbytes_per_sec": 0, 00:06:46.905 "w_mbytes_per_sec": 0 00:06:46.905 }, 00:06:46.905 "claimed": false, 00:06:46.905 "zoned": false, 00:06:46.905 "supported_io_types": { 00:06:46.905 "read": true, 00:06:46.905 "write": true, 00:06:46.905 "unmap": true, 00:06:46.905 "flush": true, 00:06:46.905 "reset": true, 00:06:46.905 "nvme_admin": false, 00:06:46.905 "nvme_io": false, 00:06:46.905 "nvme_io_md": false, 00:06:46.905 "write_zeroes": true, 00:06:46.905 "zcopy": true, 00:06:46.905 "get_zone_info": false, 00:06:46.905 "zone_management": false, 00:06:46.905 "zone_append": false, 00:06:46.905 "compare": false, 00:06:46.905 "compare_and_write": false, 00:06:46.905 "abort": true, 00:06:46.905 "seek_hole": false, 00:06:46.905 "seek_data": false, 00:06:46.905 "copy": true, 00:06:46.905 "nvme_iov_md": false 00:06:46.905 }, 00:06:46.905 "memory_domains": [ 00:06:46.905 { 00:06:46.905 "dma_device_id": "system", 00:06:46.905 "dma_device_type": 1 00:06:46.905 }, 00:06:46.905 { 00:06:46.905 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:46.905 "dma_device_type": 2 00:06:46.905 } 00:06:46.905 ], 00:06:46.905 "driver_specific": {} 00:06:46.905 } 00:06:46.905 ]' 00:06:46.905 14:19:42 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:46.905 14:19:42 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:46.905 14:19:42 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:06:46.905 14:19:42 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:46.905 14:19:42 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:46.906 [2024-11-18 14:19:42.957931] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:06:46.906 
[2024-11-18 14:19:42.957962] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:46.906 [2024-11-18 14:19:42.957979] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x5a68420 00:06:46.906 [2024-11-18 14:19:42.957989] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:46.906 [2024-11-18 14:19:42.958753] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:46.906 [2024-11-18 14:19:42.958777] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:46.906 Passthru0 00:06:46.906 14:19:42 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:46.906 14:19:42 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:46.906 14:19:42 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:46.906 14:19:42 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:46.906 14:19:42 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:46.906 14:19:42 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:46.906 { 00:06:46.906 "name": "Malloc2", 00:06:46.906 "aliases": [ 00:06:46.906 "33cea8e8-e3db-4ad8-a673-9759e5e36462" 00:06:46.906 ], 00:06:46.906 "product_name": "Malloc disk", 00:06:46.906 "block_size": 512, 00:06:46.906 "num_blocks": 16384, 00:06:46.906 "uuid": "33cea8e8-e3db-4ad8-a673-9759e5e36462", 00:06:46.906 "assigned_rate_limits": { 00:06:46.906 "rw_ios_per_sec": 0, 00:06:46.906 "rw_mbytes_per_sec": 0, 00:06:46.906 "r_mbytes_per_sec": 0, 00:06:46.906 "w_mbytes_per_sec": 0 00:06:46.906 }, 00:06:46.906 "claimed": true, 00:06:46.906 "claim_type": "exclusive_write", 00:06:46.906 "zoned": false, 00:06:46.906 "supported_io_types": { 00:06:46.906 "read": true, 00:06:46.906 "write": true, 00:06:46.906 "unmap": true, 00:06:46.906 "flush": true, 00:06:46.906 "reset": true, 00:06:46.906 "nvme_admin": false, 00:06:46.906 "nvme_io": false, 00:06:46.906 "nvme_io_md": false, 00:06:46.906 "write_zeroes": true, 00:06:46.906 "zcopy": true, 00:06:46.906 "get_zone_info": false, 00:06:46.906 "zone_management": false, 00:06:46.906 "zone_append": false, 00:06:46.906 "compare": false, 00:06:46.906 "compare_and_write": false, 00:06:46.906 "abort": true, 00:06:46.906 "seek_hole": false, 00:06:46.906 "seek_data": false, 00:06:46.906 "copy": true, 00:06:46.906 "nvme_iov_md": false 00:06:46.906 }, 00:06:46.906 "memory_domains": [ 00:06:46.906 { 00:06:46.906 "dma_device_id": "system", 00:06:46.906 "dma_device_type": 1 00:06:46.906 }, 00:06:46.906 { 00:06:46.906 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:46.906 "dma_device_type": 2 00:06:46.906 } 00:06:46.906 ], 00:06:46.906 "driver_specific": {} 00:06:46.906 }, 00:06:46.906 { 00:06:46.906 "name": "Passthru0", 00:06:46.906 "aliases": [ 00:06:46.906 "815bd450-e821-527d-b8b2-18183da5f467" 00:06:46.906 ], 00:06:46.906 "product_name": "passthru", 00:06:46.906 "block_size": 512, 00:06:46.906 "num_blocks": 16384, 00:06:46.906 "uuid": "815bd450-e821-527d-b8b2-18183da5f467", 00:06:46.906 "assigned_rate_limits": { 00:06:46.906 "rw_ios_per_sec": 0, 00:06:46.906 "rw_mbytes_per_sec": 0, 00:06:46.906 "r_mbytes_per_sec": 0, 00:06:46.906 "w_mbytes_per_sec": 0 00:06:46.906 }, 00:06:46.906 "claimed": false, 00:06:46.906 "zoned": false, 00:06:46.906 "supported_io_types": { 00:06:46.906 "read": true, 00:06:46.906 "write": true, 00:06:46.906 "unmap": true, 00:06:46.906 "flush": true, 00:06:46.906 "reset": true, 
00:06:46.906 "nvme_admin": false, 00:06:46.906 "nvme_io": false, 00:06:46.906 "nvme_io_md": false, 00:06:46.906 "write_zeroes": true, 00:06:46.906 "zcopy": true, 00:06:46.906 "get_zone_info": false, 00:06:46.906 "zone_management": false, 00:06:46.906 "zone_append": false, 00:06:46.906 "compare": false, 00:06:46.906 "compare_and_write": false, 00:06:46.906 "abort": true, 00:06:46.906 "seek_hole": false, 00:06:46.906 "seek_data": false, 00:06:46.906 "copy": true, 00:06:46.906 "nvme_iov_md": false 00:06:46.906 }, 00:06:46.906 "memory_domains": [ 00:06:46.906 { 00:06:46.906 "dma_device_id": "system", 00:06:46.906 "dma_device_type": 1 00:06:46.906 }, 00:06:46.906 { 00:06:46.906 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:46.906 "dma_device_type": 2 00:06:46.906 } 00:06:46.906 ], 00:06:46.906 "driver_specific": { 00:06:46.906 "passthru": { 00:06:46.906 "name": "Passthru0", 00:06:46.906 "base_bdev_name": "Malloc2" 00:06:46.906 } 00:06:46.906 } 00:06:46.906 } 00:06:46.906 ]' 00:06:46.906 14:19:42 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:47.167 14:19:43 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:47.167 14:19:43 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:47.167 14:19:43 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:47.167 14:19:43 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:47.167 14:19:43 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:47.167 14:19:43 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:06:47.167 14:19:43 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:47.167 14:19:43 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:47.167 14:19:43 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:47.167 14:19:43 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:47.167 14:19:43 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:47.167 14:19:43 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:47.167 14:19:43 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:47.167 14:19:43 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:47.167 14:19:43 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:47.167 14:19:43 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:47.167 00:06:47.167 real 0m0.298s 00:06:47.167 user 0m0.185s 00:06:47.167 sys 0m0.046s 00:06:47.167 14:19:43 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:47.167 14:19:43 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:47.167 ************************************ 00:06:47.167 END TEST rpc_daemon_integrity 00:06:47.167 ************************************ 00:06:47.167 14:19:43 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:06:47.167 14:19:43 rpc -- rpc/rpc.sh@84 -- # killprocess 309087 00:06:47.167 14:19:43 rpc -- common/autotest_common.sh@954 -- # '[' -z 309087 ']' 00:06:47.167 14:19:43 rpc -- common/autotest_common.sh@958 -- # kill -0 309087 00:06:47.167 14:19:43 rpc -- common/autotest_common.sh@959 -- # uname 00:06:47.167 14:19:43 rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:47.167 14:19:43 rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 309087 
00:06:47.167 14:19:43 rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:47.167 14:19:43 rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:47.167 14:19:43 rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 309087' 00:06:47.167 killing process with pid 309087 00:06:47.167 14:19:43 rpc -- common/autotest_common.sh@973 -- # kill 309087 00:06:47.167 14:19:43 rpc -- common/autotest_common.sh@978 -- # wait 309087 00:06:47.427 00:06:47.427 real 0m2.159s 00:06:47.427 user 0m2.758s 00:06:47.427 sys 0m0.814s 00:06:47.427 14:19:43 rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:47.427 14:19:43 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:47.427 ************************************ 00:06:47.427 END TEST rpc 00:06:47.427 ************************************ 00:06:47.428 14:19:43 -- spdk/autotest.sh@157 -- # run_test skip_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:47.428 14:19:43 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:47.428 14:19:43 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:47.428 14:19:43 -- common/autotest_common.sh@10 -- # set +x 00:06:47.688 ************************************ 00:06:47.688 START TEST skip_rpc 00:06:47.688 ************************************ 00:06:47.688 14:19:43 skip_rpc -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:47.688 * Looking for test storage... 00:06:47.688 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:47.688 14:19:43 skip_rpc -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:47.688 14:19:43 skip_rpc -- common/autotest_common.sh@1693 -- # lcov --version 00:06:47.688 14:19:43 skip_rpc -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:47.688 14:19:43 skip_rpc -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@344 -- # case "$op" in 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@345 -- # : 1 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@365 -- # decimal 1 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@353 -- # local d=1 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@355 -- # echo 1 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@366 -- # decimal 2 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@353 -- # local d=2 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@355 -- # echo 2 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:47.688 14:19:43 skip_rpc -- scripts/common.sh@368 -- # return 0 00:06:47.688 14:19:43 skip_rpc -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:47.688 14:19:43 skip_rpc -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:47.688 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:47.688 --rc genhtml_branch_coverage=1 00:06:47.688 --rc genhtml_function_coverage=1 00:06:47.688 --rc genhtml_legend=1 00:06:47.688 --rc geninfo_all_blocks=1 00:06:47.688 --rc geninfo_unexecuted_blocks=1 00:06:47.688 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:47.688 ' 00:06:47.688 14:19:43 skip_rpc -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:47.688 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:47.688 --rc genhtml_branch_coverage=1 00:06:47.688 --rc genhtml_function_coverage=1 00:06:47.688 --rc genhtml_legend=1 00:06:47.688 --rc geninfo_all_blocks=1 00:06:47.688 --rc geninfo_unexecuted_blocks=1 00:06:47.688 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:47.688 ' 00:06:47.688 14:19:43 skip_rpc -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:47.688 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:47.688 --rc genhtml_branch_coverage=1 00:06:47.688 --rc genhtml_function_coverage=1 00:06:47.688 --rc genhtml_legend=1 00:06:47.688 --rc geninfo_all_blocks=1 00:06:47.688 --rc geninfo_unexecuted_blocks=1 00:06:47.688 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:47.688 ' 00:06:47.688 14:19:43 skip_rpc -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:47.688 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:47.688 --rc genhtml_branch_coverage=1 00:06:47.688 --rc genhtml_function_coverage=1 00:06:47.688 --rc genhtml_legend=1 00:06:47.688 --rc geninfo_all_blocks=1 00:06:47.688 --rc geninfo_unexecuted_blocks=1 00:06:47.688 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:47.688 ' 00:06:47.688 14:19:43 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:47.688 14:19:43 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:47.688 14:19:43 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:06:47.688 14:19:43 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:47.688 14:19:43 
skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:47.688 14:19:43 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:47.948 ************************************ 00:06:47.948 START TEST skip_rpc 00:06:47.948 ************************************ 00:06:47.948 14:19:43 skip_rpc.skip_rpc -- common/autotest_common.sh@1129 -- # test_skip_rpc 00:06:47.948 14:19:43 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=309553 00:06:47.948 14:19:43 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:06:47.948 14:19:43 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:47.948 14:19:43 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:06:47.948 [2024-11-18 14:19:43.847037] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:06:47.948 [2024-11-18 14:19:43.847094] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid309553 ] 00:06:47.948 [2024-11-18 14:19:43.930805] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:47.948 [2024-11-18 14:19:43.952813] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.235 14:19:48 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:06:53.235 14:19:48 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # local es=0 00:06:53.235 14:19:48 skip_rpc.skip_rpc -- common/autotest_common.sh@654 -- # valid_exec_arg rpc_cmd spdk_get_version 00:06:53.235 14:19:48 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # local arg=rpc_cmd 00:06:53.235 14:19:48 skip_rpc.skip_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:53.235 14:19:48 skip_rpc.skip_rpc -- common/autotest_common.sh@644 -- # type -t rpc_cmd 00:06:53.235 14:19:48 skip_rpc.skip_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:53.235 14:19:48 skip_rpc.skip_rpc -- common/autotest_common.sh@655 -- # rpc_cmd spdk_get_version 00:06:53.235 14:19:48 skip_rpc.skip_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:53.235 14:19:48 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:53.235 14:19:48 skip_rpc.skip_rpc -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]] 00:06:53.235 14:19:48 skip_rpc.skip_rpc -- common/autotest_common.sh@655 -- # es=1 00:06:53.235 14:19:48 skip_rpc.skip_rpc -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:06:53.235 14:19:48 skip_rpc.skip_rpc -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:06:53.236 14:19:48 skip_rpc.skip_rpc -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:06:53.236 14:19:48 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:06:53.236 14:19:48 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 309553 00:06:53.236 14:19:48 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # '[' -z 309553 ']' 00:06:53.236 14:19:48 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # kill -0 309553 00:06:53.236 14:19:48 skip_rpc.skip_rpc -- common/autotest_common.sh@959 -- # uname 00:06:53.236 14:19:48 skip_rpc.skip_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:53.236 14:19:48 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 309553 00:06:53.236 
14:19:48 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:53.236 14:19:48 skip_rpc.skip_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:53.236 14:19:48 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 309553' 00:06:53.236 killing process with pid 309553 00:06:53.236 14:19:48 skip_rpc.skip_rpc -- common/autotest_common.sh@973 -- # kill 309553 00:06:53.236 14:19:48 skip_rpc.skip_rpc -- common/autotest_common.sh@978 -- # wait 309553 00:06:53.236 00:06:53.236 real 0m5.362s 00:06:53.236 user 0m5.092s 00:06:53.236 sys 0m0.313s 00:06:53.236 14:19:49 skip_rpc.skip_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:53.236 14:19:49 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:53.236 ************************************ 00:06:53.236 END TEST skip_rpc 00:06:53.236 ************************************ 00:06:53.236 14:19:49 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:06:53.236 14:19:49 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:53.236 14:19:49 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:53.236 14:19:49 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:53.236 ************************************ 00:06:53.236 START TEST skip_rpc_with_json 00:06:53.236 ************************************ 00:06:53.236 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1129 -- # test_skip_rpc_with_json 00:06:53.236 14:19:49 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:06:53.236 14:19:49 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=310609 00:06:53.236 14:19:49 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:53.236 14:19:49 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:53.236 14:19:49 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 310609 00:06:53.236 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@835 -- # '[' -z 310609 ']' 00:06:53.236 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:53.236 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:53.236 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:53.236 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:53.236 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:53.236 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:53.236 [2024-11-18 14:19:49.296202] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 
00:06:53.236 [2024-11-18 14:19:49.296260] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid310609 ] 00:06:53.496 [2024-11-18 14:19:49.381589] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.496 [2024-11-18 14:19:49.403954] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.496 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:53.496 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@868 -- # return 0 00:06:53.496 14:19:49 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:06:53.496 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:53.496 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:53.496 [2024-11-18 14:19:49.600165] nvmf_rpc.c:2703:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:06:53.496 request: 00:06:53.496 { 00:06:53.496 "trtype": "tcp", 00:06:53.496 "method": "nvmf_get_transports", 00:06:53.496 "req_id": 1 00:06:53.496 } 00:06:53.496 Got JSON-RPC error response 00:06:53.496 response: 00:06:53.496 { 00:06:53.496 "code": -19, 00:06:53.496 "message": "No such device" 00:06:53.496 } 00:06:53.496 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]] 00:06:53.496 14:19:49 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:06:53.496 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:53.496 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:53.496 [2024-11-18 14:19:49.612257] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:53.496 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:53.496 14:19:49 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:06:53.496 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:53.496 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:53.756 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:53.756 14:19:49 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:53.756 { 00:06:53.756 "subsystems": [ 00:06:53.756 { 00:06:53.756 "subsystem": "scheduler", 00:06:53.756 "config": [ 00:06:53.756 { 00:06:53.756 "method": "framework_set_scheduler", 00:06:53.756 "params": { 00:06:53.756 "name": "static" 00:06:53.756 } 00:06:53.756 } 00:06:53.756 ] 00:06:53.756 }, 00:06:53.756 { 00:06:53.756 "subsystem": "vmd", 00:06:53.756 "config": [] 00:06:53.756 }, 00:06:53.756 { 00:06:53.756 "subsystem": "sock", 00:06:53.756 "config": [ 00:06:53.756 { 00:06:53.756 "method": "sock_set_default_impl", 00:06:53.756 "params": { 00:06:53.756 "impl_name": "posix" 00:06:53.756 } 00:06:53.756 }, 00:06:53.756 { 00:06:53.756 "method": "sock_impl_set_options", 00:06:53.756 "params": { 00:06:53.756 "impl_name": "ssl", 00:06:53.756 "recv_buf_size": 4096, 00:06:53.756 "send_buf_size": 4096, 00:06:53.756 "enable_recv_pipe": true, 00:06:53.756 "enable_quickack": false, 00:06:53.756 "enable_placement_id": 
0, 00:06:53.756 "enable_zerocopy_send_server": true, 00:06:53.756 "enable_zerocopy_send_client": false, 00:06:53.756 "zerocopy_threshold": 0, 00:06:53.756 "tls_version": 0, 00:06:53.756 "enable_ktls": false 00:06:53.756 } 00:06:53.756 }, 00:06:53.756 { 00:06:53.756 "method": "sock_impl_set_options", 00:06:53.756 "params": { 00:06:53.756 "impl_name": "posix", 00:06:53.756 "recv_buf_size": 2097152, 00:06:53.756 "send_buf_size": 2097152, 00:06:53.756 "enable_recv_pipe": true, 00:06:53.756 "enable_quickack": false, 00:06:53.756 "enable_placement_id": 0, 00:06:53.756 "enable_zerocopy_send_server": true, 00:06:53.756 "enable_zerocopy_send_client": false, 00:06:53.756 "zerocopy_threshold": 0, 00:06:53.756 "tls_version": 0, 00:06:53.756 "enable_ktls": false 00:06:53.756 } 00:06:53.756 } 00:06:53.756 ] 00:06:53.756 }, 00:06:53.756 { 00:06:53.756 "subsystem": "iobuf", 00:06:53.756 "config": [ 00:06:53.756 { 00:06:53.756 "method": "iobuf_set_options", 00:06:53.756 "params": { 00:06:53.756 "small_pool_count": 8192, 00:06:53.756 "large_pool_count": 1024, 00:06:53.756 "small_bufsize": 8192, 00:06:53.756 "large_bufsize": 135168, 00:06:53.756 "enable_numa": false 00:06:53.756 } 00:06:53.756 } 00:06:53.756 ] 00:06:53.756 }, 00:06:53.756 { 00:06:53.756 "subsystem": "keyring", 00:06:53.756 "config": [] 00:06:53.756 }, 00:06:53.756 { 00:06:53.756 "subsystem": "vfio_user_target", 00:06:53.756 "config": null 00:06:53.756 }, 00:06:53.756 { 00:06:53.756 "subsystem": "fsdev", 00:06:53.756 "config": [ 00:06:53.756 { 00:06:53.756 "method": "fsdev_set_opts", 00:06:53.756 "params": { 00:06:53.756 "fsdev_io_pool_size": 65535, 00:06:53.756 "fsdev_io_cache_size": 256 00:06:53.756 } 00:06:53.756 } 00:06:53.756 ] 00:06:53.756 }, 00:06:53.756 { 00:06:53.756 "subsystem": "accel", 00:06:53.756 "config": [ 00:06:53.756 { 00:06:53.756 "method": "accel_set_options", 00:06:53.756 "params": { 00:06:53.756 "small_cache_size": 128, 00:06:53.756 "large_cache_size": 16, 00:06:53.757 "task_count": 2048, 00:06:53.757 "sequence_count": 2048, 00:06:53.757 "buf_count": 2048 00:06:53.757 } 00:06:53.757 } 00:06:53.757 ] 00:06:53.757 }, 00:06:53.757 { 00:06:53.757 "subsystem": "bdev", 00:06:53.757 "config": [ 00:06:53.757 { 00:06:53.757 "method": "bdev_set_options", 00:06:53.757 "params": { 00:06:53.757 "bdev_io_pool_size": 65535, 00:06:53.757 "bdev_io_cache_size": 256, 00:06:53.757 "bdev_auto_examine": true, 00:06:53.757 "iobuf_small_cache_size": 128, 00:06:53.757 "iobuf_large_cache_size": 16 00:06:53.757 } 00:06:53.757 }, 00:06:53.757 { 00:06:53.757 "method": "bdev_raid_set_options", 00:06:53.757 "params": { 00:06:53.757 "process_window_size_kb": 1024, 00:06:53.757 "process_max_bandwidth_mb_sec": 0 00:06:53.757 } 00:06:53.757 }, 00:06:53.757 { 00:06:53.757 "method": "bdev_nvme_set_options", 00:06:53.757 "params": { 00:06:53.757 "action_on_timeout": "none", 00:06:53.757 "timeout_us": 0, 00:06:53.757 "timeout_admin_us": 0, 00:06:53.757 "keep_alive_timeout_ms": 10000, 00:06:53.757 "arbitration_burst": 0, 00:06:53.757 "low_priority_weight": 0, 00:06:53.757 "medium_priority_weight": 0, 00:06:53.757 "high_priority_weight": 0, 00:06:53.757 "nvme_adminq_poll_period_us": 10000, 00:06:53.757 "nvme_ioq_poll_period_us": 0, 00:06:53.757 "io_queue_requests": 0, 00:06:53.757 "delay_cmd_submit": true, 00:06:53.757 "transport_retry_count": 4, 00:06:53.757 "bdev_retry_count": 3, 00:06:53.757 "transport_ack_timeout": 0, 00:06:53.757 "ctrlr_loss_timeout_sec": 0, 00:06:53.757 "reconnect_delay_sec": 0, 00:06:53.757 "fast_io_fail_timeout_sec": 0, 00:06:53.757 
"disable_auto_failback": false, 00:06:53.757 "generate_uuids": false, 00:06:53.757 "transport_tos": 0, 00:06:53.757 "nvme_error_stat": false, 00:06:53.757 "rdma_srq_size": 0, 00:06:53.757 "io_path_stat": false, 00:06:53.757 "allow_accel_sequence": false, 00:06:53.757 "rdma_max_cq_size": 0, 00:06:53.757 "rdma_cm_event_timeout_ms": 0, 00:06:53.757 "dhchap_digests": [ 00:06:53.757 "sha256", 00:06:53.757 "sha384", 00:06:53.757 "sha512" 00:06:53.757 ], 00:06:53.757 "dhchap_dhgroups": [ 00:06:53.757 "null", 00:06:53.757 "ffdhe2048", 00:06:53.757 "ffdhe3072", 00:06:53.757 "ffdhe4096", 00:06:53.757 "ffdhe6144", 00:06:53.757 "ffdhe8192" 00:06:53.757 ] 00:06:53.757 } 00:06:53.757 }, 00:06:53.757 { 00:06:53.757 "method": "bdev_nvme_set_hotplug", 00:06:53.757 "params": { 00:06:53.757 "period_us": 100000, 00:06:53.757 "enable": false 00:06:53.757 } 00:06:53.757 }, 00:06:53.757 { 00:06:53.757 "method": "bdev_iscsi_set_options", 00:06:53.757 "params": { 00:06:53.757 "timeout_sec": 30 00:06:53.757 } 00:06:53.757 }, 00:06:53.757 { 00:06:53.757 "method": "bdev_wait_for_examine" 00:06:53.757 } 00:06:53.757 ] 00:06:53.757 }, 00:06:53.757 { 00:06:53.757 "subsystem": "nvmf", 00:06:53.757 "config": [ 00:06:53.757 { 00:06:53.757 "method": "nvmf_set_config", 00:06:53.757 "params": { 00:06:53.757 "discovery_filter": "match_any", 00:06:53.757 "admin_cmd_passthru": { 00:06:53.757 "identify_ctrlr": false 00:06:53.757 }, 00:06:53.757 "dhchap_digests": [ 00:06:53.757 "sha256", 00:06:53.757 "sha384", 00:06:53.757 "sha512" 00:06:53.757 ], 00:06:53.757 "dhchap_dhgroups": [ 00:06:53.757 "null", 00:06:53.757 "ffdhe2048", 00:06:53.757 "ffdhe3072", 00:06:53.757 "ffdhe4096", 00:06:53.757 "ffdhe6144", 00:06:53.757 "ffdhe8192" 00:06:53.757 ] 00:06:53.757 } 00:06:53.757 }, 00:06:53.757 { 00:06:53.757 "method": "nvmf_set_max_subsystems", 00:06:53.757 "params": { 00:06:53.757 "max_subsystems": 1024 00:06:53.757 } 00:06:53.757 }, 00:06:53.757 { 00:06:53.757 "method": "nvmf_set_crdt", 00:06:53.757 "params": { 00:06:53.757 "crdt1": 0, 00:06:53.757 "crdt2": 0, 00:06:53.757 "crdt3": 0 00:06:53.757 } 00:06:53.757 }, 00:06:53.757 { 00:06:53.757 "method": "nvmf_create_transport", 00:06:53.757 "params": { 00:06:53.757 "trtype": "TCP", 00:06:53.757 "max_queue_depth": 128, 00:06:53.757 "max_io_qpairs_per_ctrlr": 127, 00:06:53.757 "in_capsule_data_size": 4096, 00:06:53.757 "max_io_size": 131072, 00:06:53.757 "io_unit_size": 131072, 00:06:53.757 "max_aq_depth": 128, 00:06:53.757 "num_shared_buffers": 511, 00:06:53.757 "buf_cache_size": 4294967295, 00:06:53.757 "dif_insert_or_strip": false, 00:06:53.757 "zcopy": false, 00:06:53.757 "c2h_success": true, 00:06:53.757 "sock_priority": 0, 00:06:53.757 "abort_timeout_sec": 1, 00:06:53.757 "ack_timeout": 0, 00:06:53.757 "data_wr_pool_size": 0 00:06:53.757 } 00:06:53.757 } 00:06:53.757 ] 00:06:53.757 }, 00:06:53.757 { 00:06:53.757 "subsystem": "nbd", 00:06:53.757 "config": [] 00:06:53.757 }, 00:06:53.757 { 00:06:53.757 "subsystem": "ublk", 00:06:53.757 "config": [] 00:06:53.757 }, 00:06:53.757 { 00:06:53.757 "subsystem": "vhost_blk", 00:06:53.757 "config": [] 00:06:53.757 }, 00:06:53.757 { 00:06:53.757 "subsystem": "scsi", 00:06:53.757 "config": null 00:06:53.757 }, 00:06:53.757 { 00:06:53.757 "subsystem": "iscsi", 00:06:53.757 "config": [ 00:06:53.757 { 00:06:53.757 "method": "iscsi_set_options", 00:06:53.757 "params": { 00:06:53.757 "node_base": "iqn.2016-06.io.spdk", 00:06:53.757 "max_sessions": 128, 00:06:53.757 "max_connections_per_session": 2, 00:06:53.757 "max_queue_depth": 64, 00:06:53.757 
"default_time2wait": 2, 00:06:53.757 "default_time2retain": 20, 00:06:53.757 "first_burst_length": 8192, 00:06:53.757 "immediate_data": true, 00:06:53.757 "allow_duplicated_isid": false, 00:06:53.757 "error_recovery_level": 0, 00:06:53.757 "nop_timeout": 60, 00:06:53.757 "nop_in_interval": 30, 00:06:53.757 "disable_chap": false, 00:06:53.757 "require_chap": false, 00:06:53.757 "mutual_chap": false, 00:06:53.757 "chap_group": 0, 00:06:53.757 "max_large_datain_per_connection": 64, 00:06:53.757 "max_r2t_per_connection": 4, 00:06:53.757 "pdu_pool_size": 36864, 00:06:53.757 "immediate_data_pool_size": 16384, 00:06:53.757 "data_out_pool_size": 2048 00:06:53.757 } 00:06:53.757 } 00:06:53.757 ] 00:06:53.757 }, 00:06:53.757 { 00:06:53.757 "subsystem": "vhost_scsi", 00:06:53.757 "config": [] 00:06:53.757 } 00:06:53.757 ] 00:06:53.757 } 00:06:53.757 14:19:49 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:06:53.757 14:19:49 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 310609 00:06:53.757 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' -z 310609 ']' 00:06:53.757 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # kill -0 310609 00:06:53.757 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # uname 00:06:53.757 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:53.757 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 310609 00:06:53.757 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:53.757 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:53.757 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # echo 'killing process with pid 310609' 00:06:53.757 killing process with pid 310609 00:06:53.757 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@973 -- # kill 310609 00:06:53.757 14:19:49 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@978 -- # wait 310609 00:06:54.328 14:19:50 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=310657 00:06:54.328 14:19:50 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:54.328 14:19:50 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:06:59.613 14:19:55 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 310657 00:06:59.613 14:19:55 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' -z 310657 ']' 00:06:59.613 14:19:55 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # kill -0 310657 00:06:59.613 14:19:55 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # uname 00:06:59.613 14:19:55 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:59.613 14:19:55 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 310657 00:06:59.613 14:19:55 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:59.613 14:19:55 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:59.613 14:19:55 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # echo 
'killing process with pid 310657' 00:06:59.613 killing process with pid 310657 00:06:59.613 14:19:55 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@973 -- # kill 310657 00:06:59.613 14:19:55 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@978 -- # wait 310657 00:06:59.613 14:19:55 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:59.613 14:19:55 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:59.613 00:06:59.613 real 0m6.237s 00:06:59.613 user 0m5.910s 00:06:59.613 sys 0m0.669s 00:06:59.613 14:19:55 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:59.613 14:19:55 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:59.613 ************************************ 00:06:59.613 END TEST skip_rpc_with_json 00:06:59.613 ************************************ 00:06:59.613 14:19:55 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:06:59.613 14:19:55 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:59.613 14:19:55 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:59.613 14:19:55 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:59.613 ************************************ 00:06:59.613 START TEST skip_rpc_with_delay 00:06:59.613 ************************************ 00:06:59.613 14:19:55 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1129 -- # test_skip_rpc_with_delay 00:06:59.613 14:19:55 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:59.613 14:19:55 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # local es=0 00:06:59.613 14:19:55 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@654 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:59.614 14:19:55 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:59.614 14:19:55 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:59.614 14:19:55 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:59.614 14:19:55 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:59.614 14:19:55 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@646 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:59.614 14:19:55 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:59.614 14:19:55 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@646 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:59.614 14:19:55 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@646 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:59.614 14:19:55 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@655 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 
00:06:59.614 [2024-11-18 14:19:55.618502] app.c: 842:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 00:06:59.614 14:19:55 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@655 -- # es=1 00:06:59.614 14:19:55 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:06:59.614 14:19:55 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:06:59.614 14:19:55 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:06:59.614 00:06:59.614 real 0m0.047s 00:06:59.614 user 0m0.021s 00:06:59.614 sys 0m0.025s 00:06:59.614 14:19:55 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:59.614 14:19:55 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:06:59.614 ************************************ 00:06:59.614 END TEST skip_rpc_with_delay 00:06:59.614 ************************************ 00:06:59.614 14:19:55 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:06:59.614 14:19:55 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:06:59.614 14:19:55 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:06:59.614 14:19:55 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:59.614 14:19:55 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:59.614 14:19:55 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:59.614 ************************************ 00:06:59.614 START TEST exit_on_failed_rpc_init 00:06:59.614 ************************************ 00:06:59.614 14:19:55 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1129 -- # test_exit_on_failed_rpc_init 00:06:59.614 14:19:55 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=311757 00:06:59.614 14:19:55 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 311757 00:06:59.614 14:19:55 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:59.614 14:19:55 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@835 -- # '[' -z 311757 ']' 00:06:59.614 14:19:55 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:59.614 14:19:55 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:59.614 14:19:55 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:59.614 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:59.614 14:19:55 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:59.614 14:19:55 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:59.874 [2024-11-18 14:19:55.748232] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 
00:06:59.875 [2024-11-18 14:19:55.748300] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid311757 ] 00:06:59.875 [2024-11-18 14:19:55.837997] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:59.875 [2024-11-18 14:19:55.862308] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@868 -- # return 0 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # local es=0 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@654 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@646 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@646 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@646 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@655 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:07:00.135 [2024-11-18 14:19:56.094513] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:07:00.135 [2024-11-18 14:19:56.094621] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid311772 ] 00:07:00.135 [2024-11-18 14:19:56.181796] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:00.135 [2024-11-18 14:19:56.203956] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:00.135 [2024-11-18 14:19:56.204042] rpc.c: 181:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
00:07:00.135 [2024-11-18 14:19:56.204060] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:07:00.135 [2024-11-18 14:19:56.204070] app.c:1064:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@655 -- # es=234 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@664 -- # es=106 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@665 -- # case "$es" in 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@672 -- # es=1 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 311757 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # '[' -z 311757 ']' 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # kill -0 311757 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@959 -- # uname 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:00.135 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 311757 00:07:00.396 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:00.396 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:00.396 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # echo 'killing process with pid 311757' 00:07:00.396 killing process with pid 311757 00:07:00.396 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@973 -- # kill 311757 00:07:00.396 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@978 -- # wait 311757 00:07:00.657 00:07:00.657 real 0m0.860s 00:07:00.657 user 0m0.857s 00:07:00.657 sys 0m0.424s 00:07:00.657 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:00.657 14:19:56 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:07:00.657 ************************************ 00:07:00.657 END TEST exit_on_failed_rpc_init 00:07:00.657 ************************************ 00:07:00.657 14:19:56 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:07:00.657 00:07:00.657 real 0m13.045s 00:07:00.657 user 0m12.100s 00:07:00.657 sys 0m1.794s 00:07:00.657 14:19:56 skip_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:00.657 14:19:56 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:00.657 ************************************ 00:07:00.657 END TEST skip_rpc 00:07:00.657 ************************************ 00:07:00.657 14:19:56 -- spdk/autotest.sh@158 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:07:00.657 14:19:56 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:00.657 14:19:56 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:00.657 14:19:56 -- 
common/autotest_common.sh@10 -- # set +x 00:07:00.657 ************************************ 00:07:00.657 START TEST rpc_client 00:07:00.657 ************************************ 00:07:00.657 14:19:56 rpc_client -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:07:00.918 * Looking for test storage... 00:07:00.918 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:07:00.918 14:19:56 rpc_client -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:00.918 14:19:56 rpc_client -- common/autotest_common.sh@1693 -- # lcov --version 00:07:00.918 14:19:56 rpc_client -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:00.918 14:19:56 rpc_client -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@336 -- # IFS=.-: 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@336 -- # read -ra ver1 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@337 -- # IFS=.-: 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@337 -- # read -ra ver2 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@338 -- # local 'op=<' 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@340 -- # ver1_l=2 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@341 -- # ver2_l=1 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@344 -- # case "$op" in 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@345 -- # : 1 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@365 -- # decimal 1 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@353 -- # local d=1 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@355 -- # echo 1 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@365 -- # ver1[v]=1 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@366 -- # decimal 2 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@353 -- # local d=2 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@355 -- # echo 2 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@366 -- # ver2[v]=2 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:00.918 14:19:56 rpc_client -- scripts/common.sh@368 -- # return 0 00:07:00.918 14:19:56 rpc_client -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:00.918 14:19:56 rpc_client -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:00.918 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:00.918 --rc genhtml_branch_coverage=1 00:07:00.918 --rc genhtml_function_coverage=1 00:07:00.918 --rc genhtml_legend=1 00:07:00.918 --rc geninfo_all_blocks=1 00:07:00.918 --rc geninfo_unexecuted_blocks=1 00:07:00.918 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:00.918 ' 00:07:00.918 14:19:56 rpc_client -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:00.918 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:00.918 --rc genhtml_branch_coverage=1 00:07:00.918 --rc genhtml_function_coverage=1 00:07:00.918 --rc genhtml_legend=1 00:07:00.918 --rc geninfo_all_blocks=1 00:07:00.918 --rc geninfo_unexecuted_blocks=1 00:07:00.918 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:00.918 ' 00:07:00.918 14:19:56 rpc_client -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:00.918 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:00.918 --rc genhtml_branch_coverage=1 00:07:00.918 --rc genhtml_function_coverage=1 00:07:00.918 --rc genhtml_legend=1 00:07:00.918 --rc geninfo_all_blocks=1 00:07:00.918 --rc geninfo_unexecuted_blocks=1 00:07:00.918 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:00.918 ' 00:07:00.918 14:19:56 rpc_client -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:00.918 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:00.918 --rc genhtml_branch_coverage=1 00:07:00.918 --rc genhtml_function_coverage=1 00:07:00.918 --rc genhtml_legend=1 00:07:00.918 --rc geninfo_all_blocks=1 00:07:00.918 --rc geninfo_unexecuted_blocks=1 00:07:00.918 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:00.918 ' 00:07:00.918 14:19:56 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:07:00.918 OK 00:07:00.918 14:19:56 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:07:00.918 00:07:00.918 real 0m0.223s 00:07:00.918 user 0m0.120s 00:07:00.918 sys 0m0.122s 00:07:00.918 14:19:56 rpc_client -- common/autotest_common.sh@1130 -- # xtrace_disable 
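[annotation] The scripts/common.sh trace just above (@333-@368) is the harness deciding that the installed lcov (1.15) predates version 2, which selects the --gcov-tool llvm-gcov.sh options exported next. It splits both versions on '.', '-' and ':' and compares component by component, treating missing components as 0. A condensed, self-contained reconstruction of that comparison; the real cmp_versions also normalizes non-numeric components via decimal(), elided here:

  # condensed reconstruction of scripts/common.sh cmp_versions / lt
  cmp_versions() {
      local IFS=.-:              # split on the same separators as the trace
      local -a ver1 ver2
      read -ra ver1 <<< "$1"
      read -ra ver2 <<< "$3"
      local op=$2 v
      local max=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
      for (( v = 0; v < max; v++ )); do
          # absent components count as 0, so "2" vs "1.15" is 2.0 vs 1.15
          (( ${ver1[v]:-0} > ${ver2[v]:-0} )) && { [[ $op == '>' ]]; return; }
          (( ${ver1[v]:-0} < ${ver2[v]:-0} )) && { [[ $op == '<' ]]; return; }
      done
      [[ $op == '==' ]]          # all components equal
  }
  lt() { cmp_versions "$1" '<' "$2"; }
  lt 1.15 2 && echo "lcov 1.15 < 2: use the llvm-gcov.sh wrapper options"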
00:07:00.918 14:19:56 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:07:00.918 ************************************ 00:07:00.918 END TEST rpc_client 00:07:00.918 ************************************ 00:07:00.918 14:19:56 -- spdk/autotest.sh@159 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:07:00.918 14:19:56 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:00.918 14:19:56 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:00.918 14:19:56 -- common/autotest_common.sh@10 -- # set +x 00:07:00.918 ************************************ 00:07:00.918 START TEST json_config 00:07:00.918 ************************************ 00:07:00.918 14:19:57 json_config -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:07:01.179 14:19:57 json_config -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:01.179 14:19:57 json_config -- common/autotest_common.sh@1693 -- # lcov --version 00:07:01.179 14:19:57 json_config -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:01.179 14:19:57 json_config -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:01.179 14:19:57 json_config -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:01.180 14:19:57 json_config -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:01.180 14:19:57 json_config -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:01.180 14:19:57 json_config -- scripts/common.sh@336 -- # IFS=.-: 00:07:01.180 14:19:57 json_config -- scripts/common.sh@336 -- # read -ra ver1 00:07:01.180 14:19:57 json_config -- scripts/common.sh@337 -- # IFS=.-: 00:07:01.180 14:19:57 json_config -- scripts/common.sh@337 -- # read -ra ver2 00:07:01.180 14:19:57 json_config -- scripts/common.sh@338 -- # local 'op=<' 00:07:01.180 14:19:57 json_config -- scripts/common.sh@340 -- # ver1_l=2 00:07:01.180 14:19:57 json_config -- scripts/common.sh@341 -- # ver2_l=1 00:07:01.180 14:19:57 json_config -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:01.180 14:19:57 json_config -- scripts/common.sh@344 -- # case "$op" in 00:07:01.180 14:19:57 json_config -- scripts/common.sh@345 -- # : 1 00:07:01.180 14:19:57 json_config -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:01.180 14:19:57 json_config -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:01.180 14:19:57 json_config -- scripts/common.sh@365 -- # decimal 1 00:07:01.180 14:19:57 json_config -- scripts/common.sh@353 -- # local d=1 00:07:01.180 14:19:57 json_config -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:01.180 14:19:57 json_config -- scripts/common.sh@355 -- # echo 1 00:07:01.180 14:19:57 json_config -- scripts/common.sh@365 -- # ver1[v]=1 00:07:01.180 14:19:57 json_config -- scripts/common.sh@366 -- # decimal 2 00:07:01.180 14:19:57 json_config -- scripts/common.sh@353 -- # local d=2 00:07:01.180 14:19:57 json_config -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:01.180 14:19:57 json_config -- scripts/common.sh@355 -- # echo 2 00:07:01.180 14:19:57 json_config -- scripts/common.sh@366 -- # ver2[v]=2 00:07:01.180 14:19:57 json_config -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:01.180 14:19:57 json_config -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:01.180 14:19:57 json_config -- scripts/common.sh@368 -- # return 0 00:07:01.180 14:19:57 json_config -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:01.180 14:19:57 json_config -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:01.180 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:01.180 --rc genhtml_branch_coverage=1 00:07:01.180 --rc genhtml_function_coverage=1 00:07:01.180 --rc genhtml_legend=1 00:07:01.180 --rc geninfo_all_blocks=1 00:07:01.180 --rc geninfo_unexecuted_blocks=1 00:07:01.180 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:01.180 ' 00:07:01.180 14:19:57 json_config -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:01.180 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:01.180 --rc genhtml_branch_coverage=1 00:07:01.180 --rc genhtml_function_coverage=1 00:07:01.180 --rc genhtml_legend=1 00:07:01.180 --rc geninfo_all_blocks=1 00:07:01.180 --rc geninfo_unexecuted_blocks=1 00:07:01.180 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:01.180 ' 00:07:01.180 14:19:57 json_config -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:01.180 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:01.180 --rc genhtml_branch_coverage=1 00:07:01.180 --rc genhtml_function_coverage=1 00:07:01.180 --rc genhtml_legend=1 00:07:01.180 --rc geninfo_all_blocks=1 00:07:01.180 --rc geninfo_unexecuted_blocks=1 00:07:01.180 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:01.180 ' 00:07:01.180 14:19:57 json_config -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:01.180 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:01.180 --rc genhtml_branch_coverage=1 00:07:01.180 --rc genhtml_function_coverage=1 00:07:01.180 --rc genhtml_legend=1 00:07:01.180 --rc geninfo_all_blocks=1 00:07:01.180 --rc geninfo_unexecuted_blocks=1 00:07:01.180 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:01.180 ' 00:07:01.180 14:19:57 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@7 -- # uname -s 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@10 
-- # NVMF_SECOND_PORT=4421 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:01.180 14:19:57 json_config -- scripts/common.sh@15 -- # shopt -s extglob 00:07:01.180 14:19:57 json_config -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:01.180 14:19:57 json_config -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:01.180 14:19:57 json_config -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:01.180 14:19:57 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:01.180 14:19:57 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:01.180 14:19:57 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:01.180 14:19:57 json_config -- paths/export.sh@5 -- # export PATH 00:07:01.180 14:19:57 json_config -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@51 -- # : 0 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:07:01.180 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:07:01.180 14:19:57 json_config -- nvmf/common.sh@55 -- # have_pci_nics=0 00:07:01.180 14:19:57 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:07:01.180 14:19:57 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:07:01.180 14:19:57 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:07:01.180 14:19:57 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:07:01.180 14:19:57 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:07:01.180 14:19:57 json_config -- json_config/json_config.sh@27 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:07:01.180 WARNING: No tests are enabled so not running JSON configuration tests 00:07:01.180 14:19:57 json_config -- json_config/json_config.sh@28 -- # exit 0 00:07:01.180 00:07:01.180 real 0m0.201s 00:07:01.180 user 0m0.117s 00:07:01.180 sys 0m0.093s 00:07:01.180 14:19:57 json_config -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:01.180 14:19:57 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:01.180 ************************************ 00:07:01.180 END TEST json_config 00:07:01.180 ************************************ 00:07:01.180 14:19:57 -- spdk/autotest.sh@160 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:01.180 14:19:57 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:01.180 14:19:57 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:01.180 14:19:57 -- common/autotest_common.sh@10 -- # set +x 00:07:01.180 ************************************ 00:07:01.180 START TEST json_config_extra_key 00:07:01.180 ************************************ 00:07:01.180 14:19:57 json_config_extra_key -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:01.441 14:19:57 json_config_extra_key -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:01.441 14:19:57 json_config_extra_key -- common/autotest_common.sh@1693 -- # lcov 
--version 00:07:01.441 14:19:57 json_config_extra_key -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:01.441 14:19:57 json_config_extra_key -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@336 -- # IFS=.-: 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@336 -- # read -ra ver1 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@337 -- # IFS=.-: 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@337 -- # read -ra ver2 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@338 -- # local 'op=<' 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@340 -- # ver1_l=2 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@341 -- # ver2_l=1 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@344 -- # case "$op" in 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@345 -- # : 1 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@365 -- # decimal 1 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@353 -- # local d=1 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@355 -- # echo 1 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@365 -- # ver1[v]=1 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@366 -- # decimal 2 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@353 -- # local d=2 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@355 -- # echo 2 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@366 -- # ver2[v]=2 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:01.441 14:19:57 json_config_extra_key -- scripts/common.sh@368 -- # return 0 00:07:01.441 14:19:57 json_config_extra_key -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:01.441 14:19:57 json_config_extra_key -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:01.441 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:01.441 --rc genhtml_branch_coverage=1 00:07:01.441 --rc genhtml_function_coverage=1 00:07:01.441 --rc genhtml_legend=1 00:07:01.441 --rc geninfo_all_blocks=1 00:07:01.441 --rc geninfo_unexecuted_blocks=1 00:07:01.441 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:01.441 ' 00:07:01.441 14:19:57 json_config_extra_key -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:01.441 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:01.441 --rc genhtml_branch_coverage=1 
00:07:01.441 --rc genhtml_function_coverage=1 00:07:01.441 --rc genhtml_legend=1 00:07:01.441 --rc geninfo_all_blocks=1 00:07:01.441 --rc geninfo_unexecuted_blocks=1 00:07:01.441 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:01.441 ' 00:07:01.441 14:19:57 json_config_extra_key -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:01.441 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:01.441 --rc genhtml_branch_coverage=1 00:07:01.441 --rc genhtml_function_coverage=1 00:07:01.441 --rc genhtml_legend=1 00:07:01.441 --rc geninfo_all_blocks=1 00:07:01.441 --rc geninfo_unexecuted_blocks=1 00:07:01.441 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:01.441 ' 00:07:01.441 14:19:57 json_config_extra_key -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:01.441 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:01.441 --rc genhtml_branch_coverage=1 00:07:01.441 --rc genhtml_function_coverage=1 00:07:01.441 --rc genhtml_legend=1 00:07:01.441 --rc geninfo_all_blocks=1 00:07:01.441 --rc geninfo_unexecuted_blocks=1 00:07:01.441 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:01.441 ' 00:07:01.441 14:19:57 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:07:01.441 14:19:57 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:07:01.442 14:19:57 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:01.442 14:19:57 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:01.442 14:19:57 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:07:01.442 14:19:57 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:01.442 14:19:57 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:07:01.442 14:19:57 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:07:01.442 14:19:57 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:01.442 14:19:57 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:07:01.442 14:19:57 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:01.442 14:19:57 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:07:01.442 14:19:57 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:07:01.442 14:19:57 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:07:01.442 14:19:57 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:01.442 14:19:57 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:07:01.442 14:19:57 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:07:01.442 14:19:57 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:07:01.442 14:19:57 json_config_extra_key -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:01.442 14:19:57 json_config_extra_key -- scripts/common.sh@15 -- # shopt -s extglob 00:07:01.442 14:19:57 json_config_extra_key -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:01.442 14:19:57 json_config_extra_key -- 
scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:01.442 14:19:57 json_config_extra_key -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:01.442 14:19:57 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:01.442 14:19:57 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:01.442 14:19:57 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:01.442 14:19:57 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:07:01.442 14:19:57 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:01.442 14:19:57 json_config_extra_key -- nvmf/common.sh@51 -- # : 0 00:07:01.442 14:19:57 json_config_extra_key -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:07:01.442 14:19:57 json_config_extra_key -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:07:01.442 14:19:57 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:07:01.442 14:19:57 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:01.442 14:19:57 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:01.442 14:19:57 json_config_extra_key -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:07:01.442 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:07:01.442 14:19:57 json_config_extra_key -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:07:01.442 14:19:57 json_config_extra_key -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:07:01.442 14:19:57 json_config_extra_key -- nvmf/common.sh@55 -- # have_pci_nics=0 00:07:01.442 14:19:57 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:07:01.442 14:19:57 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:07:01.442 14:19:57 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # 
declare -A app_pid 00:07:01.442 14:19:57 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:07:01.442 14:19:57 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:07:01.442 14:19:57 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:07:01.442 14:19:57 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:07:01.442 14:19:57 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:07:01.442 14:19:57 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:07:01.442 14:19:57 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:07:01.442 14:19:57 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:07:01.442 INFO: launching applications... 00:07:01.442 14:19:57 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:07:01.442 14:19:57 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:07:01.442 14:19:57 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:07:01.442 14:19:57 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:07:01.442 14:19:57 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:07:01.442 14:19:57 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:07:01.442 14:19:57 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:01.442 14:19:57 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:01.442 14:19:57 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=312209 00:07:01.442 14:19:57 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:07:01.442 Waiting for target to run... 00:07:01.442 14:19:57 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 312209 /var/tmp/spdk_tgt.sock 00:07:01.442 14:19:57 json_config_extra_key -- common/autotest_common.sh@835 -- # '[' -z 312209 ']' 00:07:01.442 14:19:57 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:07:01.442 14:19:57 json_config_extra_key -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:07:01.442 14:19:57 json_config_extra_key -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:01.442 14:19:57 json_config_extra_key -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:07:01.442 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 
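[annotation] The declare -A lines traced above are json_config/common.sh (@17-@20) registering per-app state before launch: one associative array each for pid, RPC socket, EAL parameters and JSON config, keyed by app name so the same helpers can drive more than one app ('target' is the only key in this run). A rough sketch of how the launch step at @21-@25 uses them; spdk_tgt stands in for the full build/bin path from the trace, waitforlisten is the common.sh helper of that name, and $rootdir is an assumed repo-root variable:

  declare -A app_pid=([target]='')
  declare -A app_socket=([target]='/var/tmp/spdk_tgt.sock')
  declare -A app_params=([target]='-m 0x1 -s 1024')
  declare -A configs_path=([target]="$rootdir/test/json_config/extra_key.json")

  json_config_test_start_app() {
      local app=$1; shift
      # e.g. json_config_test_start_app target --json "${configs_path[target]}"
      spdk_tgt ${app_params[$app]} -r "${app_socket[$app]}" "$@" &
      app_pid[$app]=$!
      waitforlisten "${app_pid[$app]}" "${app_socket[$app]}"
  }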
00:07:01.442 14:19:57 json_config_extra_key -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:01.442 14:19:57 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:01.442 [2024-11-18 14:19:57.524477] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:07:01.442 [2024-11-18 14:19:57.524573] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid312209 ] 00:07:01.702 [2024-11-18 14:19:57.814841] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:01.702 [2024-11-18 14:19:57.827400] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:02.271 14:19:58 json_config_extra_key -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:02.271 14:19:58 json_config_extra_key -- common/autotest_common.sh@868 -- # return 0 00:07:02.271 14:19:58 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:07:02.271 00:07:02.271 14:19:58 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:07:02.271 INFO: shutting down applications... 00:07:02.271 14:19:58 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:07:02.271 14:19:58 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:07:02.271 14:19:58 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:07:02.271 14:19:58 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 312209 ]] 00:07:02.271 14:19:58 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 312209 00:07:02.271 14:19:58 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:07:02.271 14:19:58 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:02.271 14:19:58 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 312209 00:07:02.271 14:19:58 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:07:02.842 14:19:58 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:07:02.842 14:19:58 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:02.842 14:19:58 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 312209 00:07:02.842 14:19:58 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:07:02.842 14:19:58 json_config_extra_key -- json_config/common.sh@43 -- # break 00:07:02.842 14:19:58 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:07:02.842 14:19:58 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:07:02.842 SPDK target shutdown done 00:07:02.842 14:19:58 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:07:02.842 Success 00:07:02.842 00:07:02.842 real 0m1.575s 00:07:02.842 user 0m1.294s 00:07:02.842 sys 0m0.437s 00:07:02.842 14:19:58 json_config_extra_key -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:02.842 14:19:58 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:02.842 ************************************ 00:07:02.842 END TEST json_config_extra_key 00:07:02.842 ************************************ 00:07:02.842 14:19:58 -- spdk/autotest.sh@161 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 
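[annotation] The @38-@45 block traced in the json_config_extra_key shutdown above is json_config/common.sh stopping the target: SIGINT first, then up to 30 polls of kill -0 at 0.5 s intervals, clearing app_pid[] and printing 'SPDK target shutdown done' once the process is gone. Roughly, as the body of that helper with the bookkeeping trimmed ($pid stands for app_pid[$app]; the failure message wording is assumed):

  kill -SIGINT "$pid"                      # ask the reactor to exit cleanly
  for (( i = 0; i < 30; i++ )); do         # ~15 s budget before giving up
      kill -0 "$pid" 2>/dev/null || break  # kill -0 only probes liveness
      sleep 0.5
  done
  if kill -0 "$pid" 2>/dev/null; then
      echo "app did not shut down" >&2     # wording assumed, not from the trace
      return 1
  fi
  echo 'SPDK target shutdown done'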
00:07:02.842 14:19:58 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:02.842 14:19:58 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:02.842 14:19:58 -- common/autotest_common.sh@10 -- # set +x 00:07:02.842 ************************************ 00:07:02.842 START TEST alias_rpc 00:07:02.842 ************************************ 00:07:02.842 14:19:58 alias_rpc -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:03.103 * Looking for test storage... 00:07:03.103 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:07:03.103 14:19:59 alias_rpc -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:03.103 14:19:59 alias_rpc -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:03.103 14:19:59 alias_rpc -- common/autotest_common.sh@1693 -- # lcov --version 00:07:03.103 14:19:59 alias_rpc -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@344 -- # case "$op" in 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@345 -- # : 1 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@365 -- # decimal 1 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@353 -- # local d=1 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@355 -- # echo 1 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@366 -- # decimal 2 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@353 -- # local d=2 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@355 -- # echo 2 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:03.103 14:19:59 alias_rpc -- scripts/common.sh@368 -- # return 0 00:07:03.103 14:19:59 alias_rpc -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:03.103 14:19:59 alias_rpc -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:03.103 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:03.103 --rc genhtml_branch_coverage=1 00:07:03.103 --rc genhtml_function_coverage=1 00:07:03.103 --rc genhtml_legend=1 00:07:03.103 --rc geninfo_all_blocks=1 00:07:03.103 --rc geninfo_unexecuted_blocks=1 00:07:03.103 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:03.103 ' 00:07:03.103 14:19:59 alias_rpc -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:03.103 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:03.103 --rc genhtml_branch_coverage=1 00:07:03.103 --rc genhtml_function_coverage=1 00:07:03.103 --rc genhtml_legend=1 00:07:03.103 --rc geninfo_all_blocks=1 00:07:03.103 --rc geninfo_unexecuted_blocks=1 00:07:03.103 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:03.103 ' 00:07:03.103 14:19:59 alias_rpc -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:03.103 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:03.103 --rc genhtml_branch_coverage=1 00:07:03.103 --rc genhtml_function_coverage=1 00:07:03.103 --rc genhtml_legend=1 00:07:03.103 --rc geninfo_all_blocks=1 00:07:03.103 --rc geninfo_unexecuted_blocks=1 00:07:03.103 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:03.103 ' 00:07:03.103 14:19:59 alias_rpc -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:03.103 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:03.103 --rc genhtml_branch_coverage=1 00:07:03.103 --rc genhtml_function_coverage=1 00:07:03.103 --rc genhtml_legend=1 00:07:03.103 --rc geninfo_all_blocks=1 00:07:03.103 --rc geninfo_unexecuted_blocks=1 00:07:03.103 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:03.103 ' 00:07:03.103 14:19:59 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:03.103 14:19:59 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=312526 00:07:03.103 14:19:59 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 312526 00:07:03.103 14:19:59 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:03.103 14:19:59 alias_rpc -- 
common/autotest_common.sh@835 -- # '[' -z 312526 ']' 00:07:03.103 14:19:59 alias_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:03.103 14:19:59 alias_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:03.103 14:19:59 alias_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:03.103 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:03.103 14:19:59 alias_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:03.103 14:19:59 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:03.103 [2024-11-18 14:19:59.183577] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:07:03.103 [2024-11-18 14:19:59.183634] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid312526 ] 00:07:03.365 [2024-11-18 14:19:59.268643] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:03.365 [2024-11-18 14:19:59.290843] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.365 14:19:59 alias_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:03.365 14:19:59 alias_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:03.366 14:19:59 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:07:03.625 14:19:59 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 312526 00:07:03.625 14:19:59 alias_rpc -- common/autotest_common.sh@954 -- # '[' -z 312526 ']' 00:07:03.625 14:19:59 alias_rpc -- common/autotest_common.sh@958 -- # kill -0 312526 00:07:03.625 14:19:59 alias_rpc -- common/autotest_common.sh@959 -- # uname 00:07:03.625 14:19:59 alias_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:03.625 14:19:59 alias_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 312526 00:07:03.885 14:19:59 alias_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:03.885 14:19:59 alias_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:03.885 14:19:59 alias_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 312526' 00:07:03.885 killing process with pid 312526 00:07:03.885 14:19:59 alias_rpc -- common/autotest_common.sh@973 -- # kill 312526 00:07:03.885 14:19:59 alias_rpc -- common/autotest_common.sh@978 -- # wait 312526 00:07:04.145 00:07:04.145 real 0m1.102s 00:07:04.145 user 0m1.065s 00:07:04.145 sys 0m0.485s 00:07:04.145 14:20:00 alias_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:04.145 14:20:00 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:04.145 ************************************ 00:07:04.145 END TEST alias_rpc 00:07:04.145 ************************************ 00:07:04.145 14:20:00 -- spdk/autotest.sh@163 -- # [[ 0 -eq 0 ]] 00:07:04.145 14:20:00 -- spdk/autotest.sh@164 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:04.145 14:20:00 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:04.145 14:20:00 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:04.145 14:20:00 -- common/autotest_common.sh@10 -- # set +x 00:07:04.145 ************************************ 00:07:04.145 START TEST spdkcli_tcp 
00:07:04.145 ************************************ 00:07:04.145 14:20:00 spdkcli_tcp -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:04.145 * Looking for test storage... 00:07:04.145 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:07:04.145 14:20:00 spdkcli_tcp -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:04.145 14:20:00 spdkcli_tcp -- common/autotest_common.sh@1693 -- # lcov --version 00:07:04.145 14:20:00 spdkcli_tcp -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:04.405 14:20:00 spdkcli_tcp -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@336 -- # IFS=.-: 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@336 -- # read -ra ver1 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@337 -- # IFS=.-: 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@337 -- # read -ra ver2 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@338 -- # local 'op=<' 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@340 -- # ver1_l=2 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@341 -- # ver2_l=1 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@344 -- # case "$op" in 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@345 -- # : 1 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@365 -- # decimal 1 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@353 -- # local d=1 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@355 -- # echo 1 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@365 -- # ver1[v]=1 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@366 -- # decimal 2 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@353 -- # local d=2 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@355 -- # echo 2 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@366 -- # ver2[v]=2 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:04.405 14:20:00 spdkcli_tcp -- scripts/common.sh@368 -- # return 0 00:07:04.405 14:20:00 spdkcli_tcp -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:04.405 14:20:00 spdkcli_tcp -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:04.405 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.405 --rc genhtml_branch_coverage=1 00:07:04.405 --rc genhtml_function_coverage=1 00:07:04.405 --rc genhtml_legend=1 00:07:04.405 --rc geninfo_all_blocks=1 00:07:04.405 --rc geninfo_unexecuted_blocks=1 00:07:04.405 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.405 ' 00:07:04.405 14:20:00 spdkcli_tcp -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:04.405 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.405 --rc genhtml_branch_coverage=1 00:07:04.405 --rc genhtml_function_coverage=1 00:07:04.405 --rc genhtml_legend=1 00:07:04.405 --rc geninfo_all_blocks=1 00:07:04.405 --rc geninfo_unexecuted_blocks=1 00:07:04.405 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.405 ' 00:07:04.405 14:20:00 spdkcli_tcp -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:04.405 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.405 --rc genhtml_branch_coverage=1 00:07:04.405 --rc genhtml_function_coverage=1 00:07:04.405 --rc genhtml_legend=1 00:07:04.405 --rc geninfo_all_blocks=1 00:07:04.405 --rc geninfo_unexecuted_blocks=1 00:07:04.405 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.405 ' 00:07:04.405 14:20:00 spdkcli_tcp -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:04.405 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:04.405 --rc genhtml_branch_coverage=1 00:07:04.405 --rc genhtml_function_coverage=1 00:07:04.405 --rc genhtml_legend=1 00:07:04.405 --rc geninfo_all_blocks=1 00:07:04.405 --rc geninfo_unexecuted_blocks=1 00:07:04.405 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:04.405 ' 00:07:04.405 14:20:00 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:07:04.405 14:20:00 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:07:04.405 14:20:00 spdkcli_tcp -- spdkcli/common.sh@7 -- # 
spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:07:04.405 14:20:00 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:07:04.405 14:20:00 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:07:04.406 14:20:00 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:07:04.406 14:20:00 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:07:04.406 14:20:00 spdkcli_tcp -- common/autotest_common.sh@726 -- # xtrace_disable 00:07:04.406 14:20:00 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:04.406 14:20:00 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=312857 00:07:04.406 14:20:00 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 312857 00:07:04.406 14:20:00 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:07:04.406 14:20:00 spdkcli_tcp -- common/autotest_common.sh@835 -- # '[' -z 312857 ']' 00:07:04.406 14:20:00 spdkcli_tcp -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:04.406 14:20:00 spdkcli_tcp -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:04.406 14:20:00 spdkcli_tcp -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:04.406 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:04.406 14:20:00 spdkcli_tcp -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:04.406 14:20:00 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:04.406 [2024-11-18 14:20:00.379110] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:07:04.406 [2024-11-18 14:20:00.379176] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid312857 ] 00:07:04.406 [2024-11-18 14:20:00.463616] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:04.406 [2024-11-18 14:20:00.487233] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.406 [2024-11-18 14:20:00.487233] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:04.666 14:20:00 spdkcli_tcp -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:04.666 14:20:00 spdkcli_tcp -- common/autotest_common.sh@868 -- # return 0 00:07:04.666 14:20:00 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=312864 00:07:04.666 14:20:00 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:07:04.666 14:20:00 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:07:04.926 [ 00:07:04.926 "spdk_get_version", 00:07:04.926 "rpc_get_methods", 00:07:04.926 "notify_get_notifications", 00:07:04.926 "notify_get_types", 00:07:04.926 "trace_get_info", 00:07:04.926 "trace_get_tpoint_group_mask", 00:07:04.926 "trace_disable_tpoint_group", 00:07:04.926 "trace_enable_tpoint_group", 00:07:04.926 "trace_clear_tpoint_mask", 00:07:04.926 "trace_set_tpoint_mask", 00:07:04.926 "fsdev_set_opts", 00:07:04.926 "fsdev_get_opts", 00:07:04.926 "framework_get_pci_devices", 00:07:04.926 "framework_get_config", 00:07:04.926 "framework_get_subsystems", 00:07:04.926 "vfu_tgt_set_base_path", 00:07:04.926 "keyring_get_keys", 
00:07:04.926 "iobuf_get_stats", 00:07:04.926 "iobuf_set_options", 00:07:04.926 "sock_get_default_impl", 00:07:04.926 "sock_set_default_impl", 00:07:04.926 "sock_impl_set_options", 00:07:04.926 "sock_impl_get_options", 00:07:04.926 "vmd_rescan", 00:07:04.926 "vmd_remove_device", 00:07:04.926 "vmd_enable", 00:07:04.926 "accel_get_stats", 00:07:04.926 "accel_set_options", 00:07:04.926 "accel_set_driver", 00:07:04.926 "accel_crypto_key_destroy", 00:07:04.926 "accel_crypto_keys_get", 00:07:04.926 "accel_crypto_key_create", 00:07:04.926 "accel_assign_opc", 00:07:04.926 "accel_get_module_info", 00:07:04.926 "accel_get_opc_assignments", 00:07:04.926 "bdev_get_histogram", 00:07:04.926 "bdev_enable_histogram", 00:07:04.926 "bdev_set_qos_limit", 00:07:04.926 "bdev_set_qd_sampling_period", 00:07:04.926 "bdev_get_bdevs", 00:07:04.926 "bdev_reset_iostat", 00:07:04.926 "bdev_get_iostat", 00:07:04.926 "bdev_examine", 00:07:04.926 "bdev_wait_for_examine", 00:07:04.926 "bdev_set_options", 00:07:04.926 "scsi_get_devices", 00:07:04.926 "thread_set_cpumask", 00:07:04.926 "scheduler_set_options", 00:07:04.926 "framework_get_governor", 00:07:04.926 "framework_get_scheduler", 00:07:04.926 "framework_set_scheduler", 00:07:04.926 "framework_get_reactors", 00:07:04.926 "thread_get_io_channels", 00:07:04.926 "thread_get_pollers", 00:07:04.926 "thread_get_stats", 00:07:04.926 "framework_monitor_context_switch", 00:07:04.926 "spdk_kill_instance", 00:07:04.926 "log_enable_timestamps", 00:07:04.926 "log_get_flags", 00:07:04.926 "log_clear_flag", 00:07:04.926 "log_set_flag", 00:07:04.926 "log_get_level", 00:07:04.926 "log_set_level", 00:07:04.926 "log_get_print_level", 00:07:04.926 "log_set_print_level", 00:07:04.926 "framework_enable_cpumask_locks", 00:07:04.926 "framework_disable_cpumask_locks", 00:07:04.927 "framework_wait_init", 00:07:04.927 "framework_start_init", 00:07:04.927 "virtio_blk_create_transport", 00:07:04.927 "virtio_blk_get_transports", 00:07:04.927 "vhost_controller_set_coalescing", 00:07:04.927 "vhost_get_controllers", 00:07:04.927 "vhost_delete_controller", 00:07:04.927 "vhost_create_blk_controller", 00:07:04.927 "vhost_scsi_controller_remove_target", 00:07:04.927 "vhost_scsi_controller_add_target", 00:07:04.927 "vhost_start_scsi_controller", 00:07:04.927 "vhost_create_scsi_controller", 00:07:04.927 "ublk_recover_disk", 00:07:04.927 "ublk_get_disks", 00:07:04.927 "ublk_stop_disk", 00:07:04.927 "ublk_start_disk", 00:07:04.927 "ublk_destroy_target", 00:07:04.927 "ublk_create_target", 00:07:04.927 "nbd_get_disks", 00:07:04.927 "nbd_stop_disk", 00:07:04.927 "nbd_start_disk", 00:07:04.927 "env_dpdk_get_mem_stats", 00:07:04.927 "nvmf_stop_mdns_prr", 00:07:04.927 "nvmf_publish_mdns_prr", 00:07:04.927 "nvmf_subsystem_get_listeners", 00:07:04.927 "nvmf_subsystem_get_qpairs", 00:07:04.927 "nvmf_subsystem_get_controllers", 00:07:04.927 "nvmf_get_stats", 00:07:04.927 "nvmf_get_transports", 00:07:04.927 "nvmf_create_transport", 00:07:04.927 "nvmf_get_targets", 00:07:04.927 "nvmf_delete_target", 00:07:04.927 "nvmf_create_target", 00:07:04.927 "nvmf_subsystem_allow_any_host", 00:07:04.927 "nvmf_subsystem_set_keys", 00:07:04.927 "nvmf_subsystem_remove_host", 00:07:04.927 "nvmf_subsystem_add_host", 00:07:04.927 "nvmf_ns_remove_host", 00:07:04.927 "nvmf_ns_add_host", 00:07:04.927 "nvmf_subsystem_remove_ns", 00:07:04.927 "nvmf_subsystem_set_ns_ana_group", 00:07:04.927 "nvmf_subsystem_add_ns", 00:07:04.927 "nvmf_subsystem_listener_set_ana_state", 00:07:04.927 "nvmf_discovery_get_referrals", 00:07:04.927 
"nvmf_discovery_remove_referral", 00:07:04.927 "nvmf_discovery_add_referral", 00:07:04.927 "nvmf_subsystem_remove_listener", 00:07:04.927 "nvmf_subsystem_add_listener", 00:07:04.927 "nvmf_delete_subsystem", 00:07:04.927 "nvmf_create_subsystem", 00:07:04.927 "nvmf_get_subsystems", 00:07:04.927 "nvmf_set_crdt", 00:07:04.927 "nvmf_set_config", 00:07:04.927 "nvmf_set_max_subsystems", 00:07:04.927 "iscsi_get_histogram", 00:07:04.927 "iscsi_enable_histogram", 00:07:04.927 "iscsi_set_options", 00:07:04.927 "iscsi_get_auth_groups", 00:07:04.927 "iscsi_auth_group_remove_secret", 00:07:04.927 "iscsi_auth_group_add_secret", 00:07:04.927 "iscsi_delete_auth_group", 00:07:04.927 "iscsi_create_auth_group", 00:07:04.927 "iscsi_set_discovery_auth", 00:07:04.927 "iscsi_get_options", 00:07:04.927 "iscsi_target_node_request_logout", 00:07:04.927 "iscsi_target_node_set_redirect", 00:07:04.927 "iscsi_target_node_set_auth", 00:07:04.927 "iscsi_target_node_add_lun", 00:07:04.927 "iscsi_get_stats", 00:07:04.927 "iscsi_get_connections", 00:07:04.927 "iscsi_portal_group_set_auth", 00:07:04.927 "iscsi_start_portal_group", 00:07:04.927 "iscsi_delete_portal_group", 00:07:04.927 "iscsi_create_portal_group", 00:07:04.927 "iscsi_get_portal_groups", 00:07:04.927 "iscsi_delete_target_node", 00:07:04.927 "iscsi_target_node_remove_pg_ig_maps", 00:07:04.927 "iscsi_target_node_add_pg_ig_maps", 00:07:04.927 "iscsi_create_target_node", 00:07:04.927 "iscsi_get_target_nodes", 00:07:04.927 "iscsi_delete_initiator_group", 00:07:04.927 "iscsi_initiator_group_remove_initiators", 00:07:04.927 "iscsi_initiator_group_add_initiators", 00:07:04.927 "iscsi_create_initiator_group", 00:07:04.927 "iscsi_get_initiator_groups", 00:07:04.927 "fsdev_aio_delete", 00:07:04.927 "fsdev_aio_create", 00:07:04.927 "keyring_linux_set_options", 00:07:04.927 "keyring_file_remove_key", 00:07:04.927 "keyring_file_add_key", 00:07:04.927 "vfu_virtio_create_fs_endpoint", 00:07:04.927 "vfu_virtio_create_scsi_endpoint", 00:07:04.927 "vfu_virtio_scsi_remove_target", 00:07:04.927 "vfu_virtio_scsi_add_target", 00:07:04.927 "vfu_virtio_create_blk_endpoint", 00:07:04.927 "vfu_virtio_delete_endpoint", 00:07:04.927 "iaa_scan_accel_module", 00:07:04.927 "dsa_scan_accel_module", 00:07:04.927 "ioat_scan_accel_module", 00:07:04.927 "accel_error_inject_error", 00:07:04.927 "bdev_iscsi_delete", 00:07:04.927 "bdev_iscsi_create", 00:07:04.927 "bdev_iscsi_set_options", 00:07:04.927 "bdev_virtio_attach_controller", 00:07:04.927 "bdev_virtio_scsi_get_devices", 00:07:04.927 "bdev_virtio_detach_controller", 00:07:04.927 "bdev_virtio_blk_set_hotplug", 00:07:04.927 "bdev_ftl_set_property", 00:07:04.927 "bdev_ftl_get_properties", 00:07:04.927 "bdev_ftl_get_stats", 00:07:04.927 "bdev_ftl_unmap", 00:07:04.927 "bdev_ftl_unload", 00:07:04.927 "bdev_ftl_delete", 00:07:04.927 "bdev_ftl_load", 00:07:04.927 "bdev_ftl_create", 00:07:04.927 "bdev_aio_delete", 00:07:04.927 "bdev_aio_rescan", 00:07:04.927 "bdev_aio_create", 00:07:04.927 "blobfs_create", 00:07:04.927 "blobfs_detect", 00:07:04.927 "blobfs_set_cache_size", 00:07:04.927 "bdev_zone_block_delete", 00:07:04.927 "bdev_zone_block_create", 00:07:04.927 "bdev_delay_delete", 00:07:04.927 "bdev_delay_create", 00:07:04.927 "bdev_delay_update_latency", 00:07:04.927 "bdev_split_delete", 00:07:04.927 "bdev_split_create", 00:07:04.927 "bdev_error_inject_error", 00:07:04.927 "bdev_error_delete", 00:07:04.927 "bdev_error_create", 00:07:04.927 "bdev_raid_set_options", 00:07:04.927 "bdev_raid_remove_base_bdev", 00:07:04.927 "bdev_raid_add_base_bdev", 
00:07:04.927 "bdev_raid_delete", 00:07:04.927 "bdev_raid_create", 00:07:04.927 "bdev_raid_get_bdevs", 00:07:04.927 "bdev_lvol_set_parent_bdev", 00:07:04.927 "bdev_lvol_set_parent", 00:07:04.927 "bdev_lvol_check_shallow_copy", 00:07:04.927 "bdev_lvol_start_shallow_copy", 00:07:04.927 "bdev_lvol_grow_lvstore", 00:07:04.927 "bdev_lvol_get_lvols", 00:07:04.927 "bdev_lvol_get_lvstores", 00:07:04.927 "bdev_lvol_delete", 00:07:04.927 "bdev_lvol_set_read_only", 00:07:04.927 "bdev_lvol_resize", 00:07:04.927 "bdev_lvol_decouple_parent", 00:07:04.927 "bdev_lvol_inflate", 00:07:04.927 "bdev_lvol_rename", 00:07:04.927 "bdev_lvol_clone_bdev", 00:07:04.927 "bdev_lvol_clone", 00:07:04.927 "bdev_lvol_snapshot", 00:07:04.927 "bdev_lvol_create", 00:07:04.927 "bdev_lvol_delete_lvstore", 00:07:04.927 "bdev_lvol_rename_lvstore", 00:07:04.927 "bdev_lvol_create_lvstore", 00:07:04.927 "bdev_passthru_delete", 00:07:04.927 "bdev_passthru_create", 00:07:04.927 "bdev_nvme_cuse_unregister", 00:07:04.927 "bdev_nvme_cuse_register", 00:07:04.927 "bdev_opal_new_user", 00:07:04.927 "bdev_opal_set_lock_state", 00:07:04.927 "bdev_opal_delete", 00:07:04.927 "bdev_opal_get_info", 00:07:04.927 "bdev_opal_create", 00:07:04.927 "bdev_nvme_opal_revert", 00:07:04.927 "bdev_nvme_opal_init", 00:07:04.927 "bdev_nvme_send_cmd", 00:07:04.927 "bdev_nvme_set_keys", 00:07:04.927 "bdev_nvme_get_path_iostat", 00:07:04.927 "bdev_nvme_get_mdns_discovery_info", 00:07:04.927 "bdev_nvme_stop_mdns_discovery", 00:07:04.927 "bdev_nvme_start_mdns_discovery", 00:07:04.927 "bdev_nvme_set_multipath_policy", 00:07:04.927 "bdev_nvme_set_preferred_path", 00:07:04.927 "bdev_nvme_get_io_paths", 00:07:04.927 "bdev_nvme_remove_error_injection", 00:07:04.927 "bdev_nvme_add_error_injection", 00:07:04.927 "bdev_nvme_get_discovery_info", 00:07:04.927 "bdev_nvme_stop_discovery", 00:07:04.927 "bdev_nvme_start_discovery", 00:07:04.927 "bdev_nvme_get_controller_health_info", 00:07:04.927 "bdev_nvme_disable_controller", 00:07:04.927 "bdev_nvme_enable_controller", 00:07:04.927 "bdev_nvme_reset_controller", 00:07:04.927 "bdev_nvme_get_transport_statistics", 00:07:04.927 "bdev_nvme_apply_firmware", 00:07:04.927 "bdev_nvme_detach_controller", 00:07:04.927 "bdev_nvme_get_controllers", 00:07:04.927 "bdev_nvme_attach_controller", 00:07:04.927 "bdev_nvme_set_hotplug", 00:07:04.927 "bdev_nvme_set_options", 00:07:04.927 "bdev_null_resize", 00:07:04.927 "bdev_null_delete", 00:07:04.927 "bdev_null_create", 00:07:04.927 "bdev_malloc_delete", 00:07:04.927 "bdev_malloc_create" 00:07:04.927 ] 00:07:04.927 14:20:00 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:07:04.927 14:20:00 spdkcli_tcp -- common/autotest_common.sh@732 -- # xtrace_disable 00:07:04.927 14:20:00 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:04.927 14:20:00 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:07:04.927 14:20:00 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 312857 00:07:04.927 14:20:00 spdkcli_tcp -- common/autotest_common.sh@954 -- # '[' -z 312857 ']' 00:07:04.927 14:20:00 spdkcli_tcp -- common/autotest_common.sh@958 -- # kill -0 312857 00:07:04.927 14:20:00 spdkcli_tcp -- common/autotest_common.sh@959 -- # uname 00:07:04.927 14:20:00 spdkcli_tcp -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:04.927 14:20:00 spdkcli_tcp -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 312857 00:07:04.927 14:20:00 spdkcli_tcp -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:04.927 14:20:01 spdkcli_tcp -- 
common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:04.927 14:20:01 spdkcli_tcp -- common/autotest_common.sh@972 -- # echo 'killing process with pid 312857' 00:07:04.927 killing process with pid 312857 00:07:04.927 14:20:01 spdkcli_tcp -- common/autotest_common.sh@973 -- # kill 312857 00:07:04.927 14:20:01 spdkcli_tcp -- common/autotest_common.sh@978 -- # wait 312857 00:07:05.187 00:07:05.187 real 0m1.138s 00:07:05.187 user 0m1.889s 00:07:05.187 sys 0m0.510s 00:07:05.188 14:20:01 spdkcli_tcp -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:05.188 14:20:01 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:05.188 ************************************ 00:07:05.188 END TEST spdkcli_tcp 00:07:05.188 ************************************ 00:07:05.448 14:20:01 -- spdk/autotest.sh@167 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:05.448 14:20:01 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:05.448 14:20:01 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:05.448 14:20:01 -- common/autotest_common.sh@10 -- # set +x 00:07:05.448 ************************************ 00:07:05.448 START TEST dpdk_mem_utility 00:07:05.448 ************************************ 00:07:05.448 14:20:01 dpdk_mem_utility -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:05.448 * Looking for test storage... 00:07:05.448 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:07:05.448 14:20:01 dpdk_mem_utility -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:05.448 14:20:01 dpdk_mem_utility -- common/autotest_common.sh@1693 -- # lcov --version 00:07:05.448 14:20:01 dpdk_mem_utility -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:05.448 14:20:01 dpdk_mem_utility -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@336 -- # IFS=.-: 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@336 -- # read -ra ver1 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@337 -- # IFS=.-: 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@337 -- # read -ra ver2 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@338 -- # local 'op=<' 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@340 -- # ver1_l=2 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@341 -- # ver2_l=1 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@344 -- # case "$op" in 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@345 -- # : 1 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@365 -- # decimal 1 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=1 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 1 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@365 -- # ver1[v]=1 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@366 -- # decimal 2 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=2 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 2 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@366 -- # ver2[v]=2 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:05.448 14:20:01 dpdk_mem_utility -- scripts/common.sh@368 -- # return 0 00:07:05.448 14:20:01 dpdk_mem_utility -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:05.448 14:20:01 dpdk_mem_utility -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:05.448 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:05.448 --rc genhtml_branch_coverage=1 00:07:05.448 --rc genhtml_function_coverage=1 00:07:05.448 --rc genhtml_legend=1 00:07:05.448 --rc geninfo_all_blocks=1 00:07:05.448 --rc geninfo_unexecuted_blocks=1 00:07:05.448 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:05.448 ' 00:07:05.448 14:20:01 dpdk_mem_utility -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:05.448 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:05.448 --rc genhtml_branch_coverage=1 00:07:05.449 --rc genhtml_function_coverage=1 00:07:05.449 --rc genhtml_legend=1 00:07:05.449 --rc geninfo_all_blocks=1 00:07:05.449 --rc geninfo_unexecuted_blocks=1 00:07:05.449 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:05.449 ' 00:07:05.449 14:20:01 dpdk_mem_utility -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:05.449 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:05.449 --rc genhtml_branch_coverage=1 00:07:05.449 --rc genhtml_function_coverage=1 00:07:05.449 --rc genhtml_legend=1 00:07:05.449 --rc geninfo_all_blocks=1 00:07:05.449 --rc geninfo_unexecuted_blocks=1 00:07:05.449 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:05.449 ' 00:07:05.449 14:20:01 dpdk_mem_utility -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:05.449 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:05.449 --rc genhtml_branch_coverage=1 00:07:05.449 --rc genhtml_function_coverage=1 00:07:05.449 --rc genhtml_legend=1 00:07:05.449 --rc geninfo_all_blocks=1 00:07:05.449 --rc geninfo_unexecuted_blocks=1 00:07:05.449 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:05.449 ' 00:07:05.449 14:20:01 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:05.449 14:20:01 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=313074 00:07:05.449 14:20:01 dpdk_mem_utility -- 
dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 313074 00:07:05.449 14:20:01 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:07:05.449 14:20:01 dpdk_mem_utility -- common/autotest_common.sh@835 -- # '[' -z 313074 ']' 00:07:05.449 14:20:01 dpdk_mem_utility -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:05.449 14:20:01 dpdk_mem_utility -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:05.449 14:20:01 dpdk_mem_utility -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:05.449 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:05.449 14:20:01 dpdk_mem_utility -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:05.449 14:20:01 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:05.709 [2024-11-18 14:20:01.593678] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:07:05.709 [2024-11-18 14:20:01.593745] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid313074 ] 00:07:05.709 [2024-11-18 14:20:01.679895] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:05.709 [2024-11-18 14:20:01.702418] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.969 14:20:01 dpdk_mem_utility -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:05.969 14:20:01 dpdk_mem_utility -- common/autotest_common.sh@868 -- # return 0 00:07:05.969 14:20:01 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:07:05.969 14:20:01 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:07:05.969 14:20:01 dpdk_mem_utility -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:05.969 14:20:01 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:05.969 { 00:07:05.969 "filename": "/tmp/spdk_mem_dump.txt" 00:07:05.969 } 00:07:05.969 14:20:01 dpdk_mem_utility -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:05.969 14:20:01 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:05.969 DPDK memory size 810.000000 MiB in 1 heap(s) 00:07:05.969 1 heaps totaling size 810.000000 MiB 00:07:05.969 size: 810.000000 MiB heap id: 0 00:07:05.969 end heaps---------- 00:07:05.969 9 mempools totaling size 595.772034 MiB 00:07:05.969 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:07:05.969 size: 158.602051 MiB name: PDU_data_out_Pool 00:07:05.969 size: 92.545471 MiB name: bdev_io_313074 00:07:05.969 size: 50.003479 MiB name: msgpool_313074 00:07:05.969 size: 36.509338 MiB name: fsdev_io_313074 00:07:05.969 size: 21.763794 MiB name: PDU_Pool 00:07:05.969 size: 19.513306 MiB name: SCSI_TASK_Pool 00:07:05.969 size: 4.133484 MiB name: evtpool_313074 00:07:05.969 size: 0.026123 MiB name: Session_Pool 00:07:05.969 end mempools------- 00:07:05.969 6 memzones totaling size 4.142822 MiB 00:07:05.969 size: 1.000366 MiB name: RG_ring_0_313074 00:07:05.969 size: 1.000366 MiB name: RG_ring_1_313074 00:07:05.969 size: 1.000366 MiB name: RG_ring_4_313074 
00:07:05.969 size: 1.000366 MiB name: RG_ring_5_313074 00:07:05.969 size: 0.125366 MiB name: RG_ring_2_313074 00:07:05.969 size: 0.015991 MiB name: RG_ring_3_313074 00:07:05.969 end memzones------- 00:07:05.969 14:20:01 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:07:05.969 heap id: 0 total size: 810.000000 MiB number of busy elements: 44 number of free elements: 15 00:07:05.969 list of free elements. size: 10.862488 MiB 00:07:05.969 element at address: 0x200018a00000 with size: 0.999878 MiB 00:07:05.969 element at address: 0x200018c00000 with size: 0.999878 MiB 00:07:05.969 element at address: 0x200000400000 with size: 0.998535 MiB 00:07:05.969 element at address: 0x200031800000 with size: 0.994446 MiB 00:07:05.969 element at address: 0x200008000000 with size: 0.959839 MiB 00:07:05.969 element at address: 0x200012c00000 with size: 0.954285 MiB 00:07:05.970 element at address: 0x200018e00000 with size: 0.936584 MiB 00:07:05.970 element at address: 0x200000200000 with size: 0.717346 MiB 00:07:05.970 element at address: 0x20001a600000 with size: 0.582886 MiB 00:07:05.970 element at address: 0x200000c00000 with size: 0.495422 MiB 00:07:05.970 element at address: 0x200003e00000 with size: 0.490723 MiB 00:07:05.970 element at address: 0x200019000000 with size: 0.485657 MiB 00:07:05.970 element at address: 0x200010600000 with size: 0.481934 MiB 00:07:05.970 element at address: 0x200027a00000 with size: 0.410034 MiB 00:07:05.970 element at address: 0x200000800000 with size: 0.355042 MiB 00:07:05.970 list of standard malloc elements. size: 199.218628 MiB 00:07:05.970 element at address: 0x2000081fff80 with size: 132.000122 MiB 00:07:05.970 element at address: 0x200003ffff80 with size: 64.000122 MiB 00:07:05.970 element at address: 0x200018afff80 with size: 1.000122 MiB 00:07:05.970 element at address: 0x200018cfff80 with size: 1.000122 MiB 00:07:05.970 element at address: 0x200018efff80 with size: 1.000122 MiB 00:07:05.970 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:07:05.970 element at address: 0x200018eeff00 with size: 0.062622 MiB 00:07:05.970 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:07:05.970 element at address: 0x200018eefdc0 with size: 0.000305 MiB 00:07:05.970 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:07:05.970 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:07:05.970 element at address: 0x2000004ffa00 with size: 0.000183 MiB 00:07:05.970 element at address: 0x2000004ffac0 with size: 0.000183 MiB 00:07:05.970 element at address: 0x2000004ffb80 with size: 0.000183 MiB 00:07:05.970 element at address: 0x2000004ffd80 with size: 0.000183 MiB 00:07:05.970 element at address: 0x2000004ffe40 with size: 0.000183 MiB 00:07:05.970 element at address: 0x20000085ae40 with size: 0.000183 MiB 00:07:05.970 element at address: 0x20000085b040 with size: 0.000183 MiB 00:07:05.970 element at address: 0x20000085b100 with size: 0.000183 MiB 00:07:05.970 element at address: 0x2000008db3c0 with size: 0.000183 MiB 00:07:05.970 element at address: 0x2000008db5c0 with size: 0.000183 MiB 00:07:05.970 element at address: 0x2000008df880 with size: 0.000183 MiB 00:07:05.970 element at address: 0x2000008ffb40 with size: 0.000183 MiB 00:07:05.970 element at address: 0x200000c7ed40 with size: 0.000183 MiB 00:07:05.970 element at address: 0x200000cff000 with size: 0.000183 MiB 00:07:05.970 element at address: 0x200000cff0c0 with size: 0.000183 
MiB 00:07:05.970 element at address: 0x200003e7da00 with size: 0.000183 MiB 00:07:05.970 element at address: 0x200003e7dac0 with size: 0.000183 MiB 00:07:05.970 element at address: 0x200003efdd80 with size: 0.000183 MiB 00:07:05.970 element at address: 0x2000080fdd80 with size: 0.000183 MiB 00:07:05.970 element at address: 0x20001067b600 with size: 0.000183 MiB 00:07:05.970 element at address: 0x20001067b6c0 with size: 0.000183 MiB 00:07:05.970 element at address: 0x2000106fb980 with size: 0.000183 MiB 00:07:05.970 element at address: 0x200012cf44c0 with size: 0.000183 MiB 00:07:05.970 element at address: 0x200018eefc40 with size: 0.000183 MiB 00:07:05.970 element at address: 0x200018eefd00 with size: 0.000183 MiB 00:07:05.970 element at address: 0x2000190bc740 with size: 0.000183 MiB 00:07:05.970 element at address: 0x20001a695380 with size: 0.000183 MiB 00:07:05.970 element at address: 0x20001a695440 with size: 0.000183 MiB 00:07:05.970 element at address: 0x200027a68f80 with size: 0.000183 MiB 00:07:05.970 element at address: 0x200027a69040 with size: 0.000183 MiB 00:07:05.970 element at address: 0x200027a6fc40 with size: 0.000183 MiB 00:07:05.970 element at address: 0x200027a6fe40 with size: 0.000183 MiB 00:07:05.970 element at address: 0x200027a6ff00 with size: 0.000183 MiB 00:07:05.970 list of memzone associated elements. size: 599.918884 MiB 00:07:05.970 element at address: 0x20001a695500 with size: 211.416748 MiB 00:07:05.970 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:07:05.970 element at address: 0x200027a6ffc0 with size: 157.562561 MiB 00:07:05.970 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:07:05.970 element at address: 0x200012df4780 with size: 92.045044 MiB 00:07:05.970 associated memzone info: size: 92.044922 MiB name: MP_bdev_io_313074_0 00:07:05.970 element at address: 0x200000dff380 with size: 48.003052 MiB 00:07:05.970 associated memzone info: size: 48.002930 MiB name: MP_msgpool_313074_0 00:07:05.970 element at address: 0x2000107fdb80 with size: 36.008911 MiB 00:07:05.970 associated memzone info: size: 36.008789 MiB name: MP_fsdev_io_313074_0 00:07:05.970 element at address: 0x2000191be940 with size: 20.255554 MiB 00:07:05.970 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:07:05.970 element at address: 0x2000319feb40 with size: 18.005066 MiB 00:07:05.970 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:07:05.970 element at address: 0x2000004fff00 with size: 3.000244 MiB 00:07:05.970 associated memzone info: size: 3.000122 MiB name: MP_evtpool_313074_0 00:07:05.970 element at address: 0x2000009ffe00 with size: 2.000488 MiB 00:07:05.970 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_313074 00:07:05.970 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:07:05.970 associated memzone info: size: 1.007996 MiB name: MP_evtpool_313074 00:07:05.970 element at address: 0x2000106fba40 with size: 1.008118 MiB 00:07:05.970 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:07:05.970 element at address: 0x2000190bc800 with size: 1.008118 MiB 00:07:05.970 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:07:05.970 element at address: 0x2000080fde40 with size: 1.008118 MiB 00:07:05.970 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:07:05.970 element at address: 0x200003efde40 with size: 1.008118 MiB 00:07:05.970 associated memzone info: size: 1.007996 MiB name: 
MP_SCSI_TASK_Pool 00:07:05.970 element at address: 0x200000cff180 with size: 1.000488 MiB 00:07:05.970 associated memzone info: size: 1.000366 MiB name: RG_ring_0_313074 00:07:05.970 element at address: 0x2000008ffc00 with size: 1.000488 MiB 00:07:05.970 associated memzone info: size: 1.000366 MiB name: RG_ring_1_313074 00:07:05.970 element at address: 0x200012cf4580 with size: 1.000488 MiB 00:07:05.970 associated memzone info: size: 1.000366 MiB name: RG_ring_4_313074 00:07:05.970 element at address: 0x2000318fe940 with size: 1.000488 MiB 00:07:05.970 associated memzone info: size: 1.000366 MiB name: RG_ring_5_313074 00:07:05.970 element at address: 0x20000085b1c0 with size: 0.500488 MiB 00:07:05.970 associated memzone info: size: 0.500366 MiB name: RG_MP_fsdev_io_313074 00:07:05.970 element at address: 0x200000c7ee00 with size: 0.500488 MiB 00:07:05.970 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_313074 00:07:05.970 element at address: 0x20001067b780 with size: 0.500488 MiB 00:07:05.970 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:07:05.970 element at address: 0x200003e7db80 with size: 0.500488 MiB 00:07:05.970 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:07:05.970 element at address: 0x20001907c540 with size: 0.250488 MiB 00:07:05.970 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:07:05.970 element at address: 0x2000002b7a40 with size: 0.125488 MiB 00:07:05.970 associated memzone info: size: 0.125366 MiB name: RG_MP_evtpool_313074 00:07:05.970 element at address: 0x2000008df940 with size: 0.125488 MiB 00:07:05.970 associated memzone info: size: 0.125366 MiB name: RG_ring_2_313074 00:07:05.970 element at address: 0x2000080f5b80 with size: 0.031738 MiB 00:07:05.970 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:07:05.970 element at address: 0x200027a69100 with size: 0.023743 MiB 00:07:05.970 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:07:05.970 element at address: 0x2000008db680 with size: 0.016113 MiB 00:07:05.970 associated memzone info: size: 0.015991 MiB name: RG_ring_3_313074 00:07:05.970 element at address: 0x200027a6f240 with size: 0.002441 MiB 00:07:05.970 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:07:05.970 element at address: 0x2000004ffc40 with size: 0.000305 MiB 00:07:05.970 associated memzone info: size: 0.000183 MiB name: MP_msgpool_313074 00:07:05.970 element at address: 0x2000008db480 with size: 0.000305 MiB 00:07:05.970 associated memzone info: size: 0.000183 MiB name: MP_fsdev_io_313074 00:07:05.970 element at address: 0x20000085af00 with size: 0.000305 MiB 00:07:05.970 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_313074 00:07:05.970 element at address: 0x200027a6fd00 with size: 0.000305 MiB 00:07:05.970 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:07:05.970 14:20:02 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:07:05.970 14:20:02 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 313074 00:07:05.970 14:20:02 dpdk_mem_utility -- common/autotest_common.sh@954 -- # '[' -z 313074 ']' 00:07:05.970 14:20:02 dpdk_mem_utility -- common/autotest_common.sh@958 -- # kill -0 313074 00:07:05.971 14:20:02 dpdk_mem_utility -- common/autotest_common.sh@959 -- # uname 00:07:05.971 14:20:02 dpdk_mem_utility -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 
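The memory map dumped above is produced in two stages: the env_dpdk_get_mem_stats RPC makes the target write its DPDK heap state to /tmp/spdk_mem_dump.txt (the {"filename": ...} reply earlier in this test), and scripts/dpdk_mem_info.py then parses that dump, first for the heap/mempool/memzone summary and then, with -m 0, for the per-element map of heap id 0. A minimal manual reproduction from an SPDK checkout, assuming the same build layout as this workspace (the sleep is a crude stand-in for the test's waitforlisten helper):

    ./build/bin/spdk_tgt &                      # target opens /var/tmp/spdk.sock
    sleep 2                                     # crude stand-in for waitforlisten
    ./scripts/rpc.py env_dpdk_get_mem_stats     # target writes /tmp/spdk_mem_dump.txt
    ./scripts/dpdk_mem_info.py                  # heap / mempool / memzone totals
    ./scripts/dpdk_mem_info.py -m 0             # detailed element map for heap id 0
    kill $!                                     # stop the target
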
00:07:05.971 14:20:02 dpdk_mem_utility -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 313074 00:07:05.971 14:20:02 dpdk_mem_utility -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:05.971 14:20:02 dpdk_mem_utility -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:05.971 14:20:02 dpdk_mem_utility -- common/autotest_common.sh@972 -- # echo 'killing process with pid 313074' 00:07:05.971 killing process with pid 313074 00:07:05.971 14:20:02 dpdk_mem_utility -- common/autotest_common.sh@973 -- # kill 313074 00:07:05.971 14:20:02 dpdk_mem_utility -- common/autotest_common.sh@978 -- # wait 313074 00:07:06.542 00:07:06.542 real 0m1.001s 00:07:06.542 user 0m0.898s 00:07:06.542 sys 0m0.463s 00:07:06.542 14:20:02 dpdk_mem_utility -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:06.542 14:20:02 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:06.542 ************************************ 00:07:06.542 END TEST dpdk_mem_utility 00:07:06.542 ************************************ 00:07:06.542 14:20:02 -- spdk/autotest.sh@168 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:07:06.542 14:20:02 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:06.542 14:20:02 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:06.542 14:20:02 -- common/autotest_common.sh@10 -- # set +x 00:07:06.542 ************************************ 00:07:06.542 START TEST event 00:07:06.542 ************************************ 00:07:06.542 14:20:02 event -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:07:06.542 * Looking for test storage... 00:07:06.542 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:07:06.542 14:20:02 event -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:06.542 14:20:02 event -- common/autotest_common.sh@1693 -- # lcov --version 00:07:06.542 14:20:02 event -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:06.542 14:20:02 event -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:06.542 14:20:02 event -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:06.542 14:20:02 event -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:06.542 14:20:02 event -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:06.542 14:20:02 event -- scripts/common.sh@336 -- # IFS=.-: 00:07:06.542 14:20:02 event -- scripts/common.sh@336 -- # read -ra ver1 00:07:06.542 14:20:02 event -- scripts/common.sh@337 -- # IFS=.-: 00:07:06.542 14:20:02 event -- scripts/common.sh@337 -- # read -ra ver2 00:07:06.542 14:20:02 event -- scripts/common.sh@338 -- # local 'op=<' 00:07:06.542 14:20:02 event -- scripts/common.sh@340 -- # ver1_l=2 00:07:06.542 14:20:02 event -- scripts/common.sh@341 -- # ver2_l=1 00:07:06.542 14:20:02 event -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:06.542 14:20:02 event -- scripts/common.sh@344 -- # case "$op" in 00:07:06.542 14:20:02 event -- scripts/common.sh@345 -- # : 1 00:07:06.542 14:20:02 event -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:06.542 14:20:02 event -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:06.542 14:20:02 event -- scripts/common.sh@365 -- # decimal 1 00:07:06.542 14:20:02 event -- scripts/common.sh@353 -- # local d=1 00:07:06.542 14:20:02 event -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:06.542 14:20:02 event -- scripts/common.sh@355 -- # echo 1 00:07:06.542 14:20:02 event -- scripts/common.sh@365 -- # ver1[v]=1 00:07:06.542 14:20:02 event -- scripts/common.sh@366 -- # decimal 2 00:07:06.542 14:20:02 event -- scripts/common.sh@353 -- # local d=2 00:07:06.542 14:20:02 event -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:06.542 14:20:02 event -- scripts/common.sh@355 -- # echo 2 00:07:06.542 14:20:02 event -- scripts/common.sh@366 -- # ver2[v]=2 00:07:06.542 14:20:02 event -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:06.542 14:20:02 event -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:06.542 14:20:02 event -- scripts/common.sh@368 -- # return 0 00:07:06.542 14:20:02 event -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:06.542 14:20:02 event -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:06.542 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:06.542 --rc genhtml_branch_coverage=1 00:07:06.542 --rc genhtml_function_coverage=1 00:07:06.542 --rc genhtml_legend=1 00:07:06.542 --rc geninfo_all_blocks=1 00:07:06.542 --rc geninfo_unexecuted_blocks=1 00:07:06.542 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:06.542 ' 00:07:06.542 14:20:02 event -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:06.542 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:06.542 --rc genhtml_branch_coverage=1 00:07:06.542 --rc genhtml_function_coverage=1 00:07:06.542 --rc genhtml_legend=1 00:07:06.542 --rc geninfo_all_blocks=1 00:07:06.542 --rc geninfo_unexecuted_blocks=1 00:07:06.542 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:06.542 ' 00:07:06.542 14:20:02 event -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:06.542 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:06.542 --rc genhtml_branch_coverage=1 00:07:06.542 --rc genhtml_function_coverage=1 00:07:06.542 --rc genhtml_legend=1 00:07:06.542 --rc geninfo_all_blocks=1 00:07:06.542 --rc geninfo_unexecuted_blocks=1 00:07:06.542 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:06.542 ' 00:07:06.542 14:20:02 event -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:06.542 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:06.542 --rc genhtml_branch_coverage=1 00:07:06.542 --rc genhtml_function_coverage=1 00:07:06.542 --rc genhtml_legend=1 00:07:06.542 --rc geninfo_all_blocks=1 00:07:06.542 --rc geninfo_unexecuted_blocks=1 00:07:06.543 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:06.543 ' 00:07:06.543 14:20:02 event -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:07:06.543 14:20:02 event -- bdev/nbd_common.sh@6 -- # set -e 00:07:06.543 14:20:02 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:06.543 14:20:02 event -- common/autotest_common.sh@1105 -- # '[' 6 -le 1 ']' 00:07:06.543 14:20:02 event -- common/autotest_common.sh@1111 -- # xtrace_disable 
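The three binaries exercised next are the event-framework microbenchmarks: event_perf starts one reactor per core in the -m mask (four here) and reports a per-lcore event count for the -t run time, reactor traces a oneshot event plus 100/250/500-unit timed ticks on a single reactor, and reactor_perf reports raw single-reactor event throughput. Their invocations, as run below (paths shortened to be relative to the SPDK checkout):

    ./test/event/event_perf/event_perf -m 0xF -t 1   # per-lcore event counts on 4 reactors
    ./test/event/reactor/reactor -t 1                # oneshot + timed-tick trace
    ./test/event/reactor_perf/reactor_perf -t 1      # events per second on one reactor
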
00:07:06.543 14:20:02 event -- common/autotest_common.sh@10 -- # set +x 00:07:06.802 ************************************ 00:07:06.802 START TEST event_perf 00:07:06.802 ************************************ 00:07:06.802 14:20:02 event.event_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:06.802 Running I/O for 1 seconds...[2024-11-18 14:20:02.705145] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:07:06.802 [2024-11-18 14:20:02.705228] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid313276 ] 00:07:06.802 [2024-11-18 14:20:02.791324] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:06.802 [2024-11-18 14:20:02.817108] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:06.802 [2024-11-18 14:20:02.817217] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:06.802 [2024-11-18 14:20:02.817324] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.802 [2024-11-18 14:20:02.817326] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 3 00:07:07.741 Running I/O for 1 seconds... 00:07:07.741 lcore 0: 190538 00:07:07.741 lcore 1: 190538 00:07:07.741 lcore 2: 190538 00:07:07.741 lcore 3: 190535 00:07:07.741 done. 00:07:07.741 00:07:07.741 real 0m1.157s 00:07:07.741 user 0m4.060s 00:07:07.741 sys 0m0.094s 00:07:07.741 14:20:03 event.event_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:07.741 14:20:03 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:07:07.741 ************************************ 00:07:07.741 END TEST event_perf 00:07:07.741 ************************************ 00:07:08.001 14:20:03 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:08.001 14:20:03 event -- common/autotest_common.sh@1105 -- # '[' 4 -le 1 ']' 00:07:08.001 14:20:03 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:08.001 14:20:03 event -- common/autotest_common.sh@10 -- # set +x 00:07:08.001 ************************************ 00:07:08.001 START TEST event_reactor 00:07:08.001 ************************************ 00:07:08.001 14:20:03 event.event_reactor -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:08.001 [2024-11-18 14:20:03.950783] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 
00:07:08.001 [2024-11-18 14:20:03.950862] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid313562 ] 00:07:08.001 [2024-11-18 14:20:04.041804] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:08.001 [2024-11-18 14:20:04.064028] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.382 test_start 00:07:09.382 oneshot 00:07:09.382 tick 100 00:07:09.382 tick 100 00:07:09.382 tick 250 00:07:09.382 tick 100 00:07:09.382 tick 100 00:07:09.382 tick 100 00:07:09.382 tick 250 00:07:09.382 tick 500 00:07:09.382 tick 100 00:07:09.382 tick 100 00:07:09.382 tick 250 00:07:09.382 tick 100 00:07:09.382 tick 100 00:07:09.382 test_end 00:07:09.382 00:07:09.382 real 0m1.165s 00:07:09.382 user 0m1.064s 00:07:09.382 sys 0m0.097s 00:07:09.382 14:20:05 event.event_reactor -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:09.382 14:20:05 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:07:09.382 ************************************ 00:07:09.382 END TEST event_reactor 00:07:09.382 ************************************ 00:07:09.382 14:20:05 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:09.382 14:20:05 event -- common/autotest_common.sh@1105 -- # '[' 4 -le 1 ']' 00:07:09.382 14:20:05 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:09.382 14:20:05 event -- common/autotest_common.sh@10 -- # set +x 00:07:09.382 ************************************ 00:07:09.382 START TEST event_reactor_perf 00:07:09.382 ************************************ 00:07:09.382 14:20:05 event.event_reactor_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:09.382 [2024-11-18 14:20:05.197004] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 
00:07:09.382 [2024-11-18 14:20:05.197085] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid313842 ] 00:07:09.382 [2024-11-18 14:20:05.286759] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:09.382 [2024-11-18 14:20:05.309632] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.322 test_start 00:07:10.322 test_end 00:07:10.322 Performance: 994675 events per second 00:07:10.322 00:07:10.322 real 0m1.159s 00:07:10.322 user 0m1.066s 00:07:10.322 sys 0m0.089s 00:07:10.322 14:20:06 event.event_reactor_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:10.322 14:20:06 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:07:10.322 ************************************ 00:07:10.322 END TEST event_reactor_perf 00:07:10.322 ************************************ 00:07:10.322 14:20:06 event -- event/event.sh@49 -- # uname -s 00:07:10.322 14:20:06 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:07:10.322 14:20:06 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:10.322 14:20:06 event -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:10.322 14:20:06 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:10.322 14:20:06 event -- common/autotest_common.sh@10 -- # set +x 00:07:10.322 ************************************ 00:07:10.322 START TEST event_scheduler 00:07:10.322 ************************************ 00:07:10.322 14:20:06 event.event_scheduler -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:10.582 * Looking for test storage... 
00:07:10.582 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:07:10.582 14:20:06 event.event_scheduler -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:10.582 14:20:06 event.event_scheduler -- common/autotest_common.sh@1693 -- # lcov --version 00:07:10.582 14:20:06 event.event_scheduler -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:10.582 14:20:06 event.event_scheduler -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:10.582 14:20:06 event.event_scheduler -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:10.582 14:20:06 event.event_scheduler -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:10.582 14:20:06 event.event_scheduler -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:10.582 14:20:06 event.event_scheduler -- scripts/common.sh@336 -- # IFS=.-: 00:07:10.582 14:20:06 event.event_scheduler -- scripts/common.sh@336 -- # read -ra ver1 00:07:10.582 14:20:06 event.event_scheduler -- scripts/common.sh@337 -- # IFS=.-: 00:07:10.582 14:20:06 event.event_scheduler -- scripts/common.sh@337 -- # read -ra ver2 00:07:10.582 14:20:06 event.event_scheduler -- scripts/common.sh@338 -- # local 'op=<' 00:07:10.582 14:20:06 event.event_scheduler -- scripts/common.sh@340 -- # ver1_l=2 00:07:10.582 14:20:06 event.event_scheduler -- scripts/common.sh@341 -- # ver2_l=1 00:07:10.582 14:20:06 event.event_scheduler -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:10.582 14:20:06 event.event_scheduler -- scripts/common.sh@344 -- # case "$op" in 00:07:10.582 14:20:06 event.event_scheduler -- scripts/common.sh@345 -- # : 1 00:07:10.582 14:20:06 event.event_scheduler -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:10.582 14:20:06 event.event_scheduler -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:10.582 14:20:06 event.event_scheduler -- scripts/common.sh@365 -- # decimal 1 00:07:10.582 14:20:06 event.event_scheduler -- scripts/common.sh@353 -- # local d=1 00:07:10.582 14:20:06 event.event_scheduler -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:10.582 14:20:06 event.event_scheduler -- scripts/common.sh@355 -- # echo 1 00:07:10.582 14:20:06 event.event_scheduler -- scripts/common.sh@365 -- # ver1[v]=1 00:07:10.582 14:20:06 event.event_scheduler -- scripts/common.sh@366 -- # decimal 2 00:07:10.582 14:20:06 event.event_scheduler -- scripts/common.sh@353 -- # local d=2 00:07:10.583 14:20:06 event.event_scheduler -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:10.583 14:20:06 event.event_scheduler -- scripts/common.sh@355 -- # echo 2 00:07:10.583 14:20:06 event.event_scheduler -- scripts/common.sh@366 -- # ver2[v]=2 00:07:10.583 14:20:06 event.event_scheduler -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:10.583 14:20:06 event.event_scheduler -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:10.583 14:20:06 event.event_scheduler -- scripts/common.sh@368 -- # return 0 00:07:10.583 14:20:06 event.event_scheduler -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:10.583 14:20:06 event.event_scheduler -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:10.583 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.583 --rc genhtml_branch_coverage=1 00:07:10.583 --rc genhtml_function_coverage=1 00:07:10.583 --rc genhtml_legend=1 00:07:10.583 --rc geninfo_all_blocks=1 00:07:10.583 --rc geninfo_unexecuted_blocks=1 00:07:10.583 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.583 ' 00:07:10.583 14:20:06 event.event_scheduler -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:10.583 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.583 --rc genhtml_branch_coverage=1 00:07:10.583 --rc genhtml_function_coverage=1 00:07:10.583 --rc genhtml_legend=1 00:07:10.583 --rc geninfo_all_blocks=1 00:07:10.583 --rc geninfo_unexecuted_blocks=1 00:07:10.583 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.583 ' 00:07:10.583 14:20:06 event.event_scheduler -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:10.583 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.583 --rc genhtml_branch_coverage=1 00:07:10.583 --rc genhtml_function_coverage=1 00:07:10.583 --rc genhtml_legend=1 00:07:10.583 --rc geninfo_all_blocks=1 00:07:10.583 --rc geninfo_unexecuted_blocks=1 00:07:10.583 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.583 ' 00:07:10.583 14:20:06 event.event_scheduler -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:10.583 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:10.583 --rc genhtml_branch_coverage=1 00:07:10.583 --rc genhtml_function_coverage=1 00:07:10.583 --rc genhtml_legend=1 00:07:10.583 --rc geninfo_all_blocks=1 00:07:10.583 --rc geninfo_unexecuted_blocks=1 00:07:10.583 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:10.583 ' 00:07:10.583 14:20:06 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:07:10.583 14:20:06 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=314164 00:07:10.583 14:20:06 event.event_scheduler -- 
scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:07:10.583 14:20:06 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:07:10.583 14:20:06 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 314164 00:07:10.583 14:20:06 event.event_scheduler -- common/autotest_common.sh@835 -- # '[' -z 314164 ']' 00:07:10.583 14:20:06 event.event_scheduler -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:10.583 14:20:06 event.event_scheduler -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:10.583 14:20:06 event.event_scheduler -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:10.583 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:10.583 14:20:06 event.event_scheduler -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:10.583 14:20:06 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:10.583 [2024-11-18 14:20:06.651357] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:07:10.583 [2024-11-18 14:20:06.651427] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid314164 ] 00:07:10.842 [2024-11-18 14:20:06.737931] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:10.842 [2024-11-18 14:20:06.763435] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.842 [2024-11-18 14:20:06.763543] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:10.842 [2024-11-18 14:20:06.763655] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:10.842 [2024-11-18 14:20:06.763655] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 3 00:07:10.842 14:20:06 event.event_scheduler -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:10.842 14:20:06 event.event_scheduler -- common/autotest_common.sh@868 -- # return 0 00:07:10.842 14:20:06 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:07:10.842 14:20:06 event.event_scheduler -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:10.842 14:20:06 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:10.842 [2024-11-18 14:20:06.836434] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:07:10.842 [2024-11-18 14:20:06.836456] scheduler_dynamic.c: 280:init: *NOTICE*: Unable to initialize dpdk governor 00:07:10.842 [2024-11-18 14:20:06.836467] scheduler_dynamic.c: 427:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:07:10.842 [2024-11-18 14:20:06.836475] scheduler_dynamic.c: 429:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:07:10.842 [2024-11-18 14:20:06.836482] scheduler_dynamic.c: 431:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:07:10.842 14:20:06 event.event_scheduler -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:10.842 14:20:06 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:07:10.842 14:20:06 event.event_scheduler -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:10.842 
14:20:06 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:10.842 [2024-11-18 14:20:06.904172] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:07:10.842 14:20:06 event.event_scheduler -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:10.842 14:20:06 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:07:10.842 14:20:06 event.event_scheduler -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:10.842 14:20:06 event.event_scheduler -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:10.842 14:20:06 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:10.842 ************************************ 00:07:10.842 START TEST scheduler_create_thread 00:07:10.842 ************************************ 00:07:10.842 14:20:06 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1129 -- # scheduler_create_thread 00:07:10.842 14:20:06 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:07:10.842 14:20:06 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:10.843 14:20:06 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:10.843 2 00:07:10.843 14:20:06 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:10.843 14:20:06 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:07:10.843 14:20:06 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:10.843 14:20:06 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:10.843 3 00:07:10.843 14:20:06 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:11.101 14:20:06 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:07:11.101 14:20:06 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:11.101 14:20:06 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:11.101 4 00:07:11.101 14:20:06 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:11.101 14:20:06 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:07:11.101 14:20:06 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:11.101 14:20:06 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:11.101 5 00:07:11.101 14:20:06 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:11.101 14:20:06 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:07:11.101 14:20:06 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:11.101 14:20:06 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:11.101 6 00:07:11.101 14:20:06 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:11.101 14:20:06 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:07:11.101 14:20:06 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:11.101 14:20:06 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:11.101 7 00:07:11.101 14:20:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:11.101 14:20:07 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:07:11.101 14:20:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:11.101 14:20:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:11.101 8 00:07:11.101 14:20:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:11.101 14:20:07 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:07:11.101 14:20:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:11.101 14:20:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:11.101 9 00:07:11.101 14:20:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:11.101 14:20:07 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:07:11.101 14:20:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:11.101 14:20:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:11.101 10 00:07:11.101 14:20:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:11.101 14:20:07 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:07:11.101 14:20:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:11.101 14:20:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:11.101 14:20:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:11.101 14:20:07 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:07:11.101 14:20:07 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:07:11.101 14:20:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:11.101 14:20:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:11.101 14:20:07 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:11.101 14:20:07 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:07:11.101 14:20:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:11.101 14:20:07 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:12.476 14:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:12.476 14:20:08 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:07:12.476 14:20:08 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:07:12.476 14:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:12.476 14:20:08 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:13.849 14:20:09 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:13.849 00:07:13.849 real 0m2.621s 00:07:13.849 user 0m0.024s 00:07:13.849 sys 0m0.007s 00:07:13.849 14:20:09 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:13.849 14:20:09 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:13.849 ************************************ 00:07:13.849 END TEST scheduler_create_thread 00:07:13.849 ************************************ 00:07:13.849 14:20:09 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:07:13.849 14:20:09 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 314164 00:07:13.849 14:20:09 event.event_scheduler -- common/autotest_common.sh@954 -- # '[' -z 314164 ']' 00:07:13.849 14:20:09 event.event_scheduler -- common/autotest_common.sh@958 -- # kill -0 314164 00:07:13.849 14:20:09 event.event_scheduler -- common/autotest_common.sh@959 -- # uname 00:07:13.849 14:20:09 event.event_scheduler -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:13.849 14:20:09 event.event_scheduler -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 314164 00:07:13.849 14:20:09 event.event_scheduler -- common/autotest_common.sh@960 -- # process_name=reactor_2 00:07:13.849 14:20:09 event.event_scheduler -- common/autotest_common.sh@964 -- # '[' reactor_2 = sudo ']' 00:07:13.849 14:20:09 event.event_scheduler -- common/autotest_common.sh@972 -- # echo 'killing process with pid 314164' 00:07:13.849 killing process with pid 314164 00:07:13.849 14:20:09 event.event_scheduler -- common/autotest_common.sh@973 -- # kill 314164 00:07:13.849 14:20:09 event.event_scheduler -- common/autotest_common.sh@978 -- # wait 314164 00:07:14.109 [2024-11-18 14:20:10.046426] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
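Everything the scheduler test above did went over JSON-RPC on /var/tmp/spdk.sock. A condensed sketch of that sequence, for reference — the thread ids 11 and 12 are simply what this run got back (scheduler_thread_create prints the new id), and only the 0x1-pinned pair of the four per-core pairs is shown:

  rpc() { ./scripts/rpc.py -s /var/tmp/spdk.sock "$@"; }
  rpc framework_set_scheduler dynamic       # select the dynamic scheduler while the app waits in --wait-for-rpc
  rpc framework_start_init                  # finish init; logs 'Scheduler test application started'
  # one busy and one idle thread pinned to each core of the 0xF mask
  rpc --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100
  rpc --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0
  # unpinned threads, then exercise set-active and delete through the returned ids
  tid=$(rpc --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0)
  rpc --plugin scheduler_plugin scheduler_thread_set_active "$tid" 50
  tid=$(rpc --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100)
  rpc --plugin scheduler_plugin scheduler_thread_delete "$tid"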
00:07:14.109 00:07:14.109 real 0m3.775s 00:07:14.109 user 0m5.689s 00:07:14.109 sys 0m0.449s 00:07:14.109 14:20:10 event.event_scheduler -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:14.109 14:20:10 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:14.109 ************************************ 00:07:14.109 END TEST event_scheduler 00:07:14.109 ************************************ 00:07:14.369 14:20:10 event -- event/event.sh@51 -- # modprobe -n nbd 00:07:14.369 14:20:10 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:07:14.369 14:20:10 event -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:14.369 14:20:10 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:14.369 14:20:10 event -- common/autotest_common.sh@10 -- # set +x 00:07:14.369 ************************************ 00:07:14.369 START TEST app_repeat 00:07:14.369 ************************************ 00:07:14.369 14:20:10 event.app_repeat -- common/autotest_common.sh@1129 -- # app_repeat_test 00:07:14.369 14:20:10 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:14.369 14:20:10 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:14.369 14:20:10 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:07:14.369 14:20:10 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:14.369 14:20:10 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:07:14.369 14:20:10 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:07:14.369 14:20:10 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:07:14.369 14:20:10 event.app_repeat -- event/event.sh@19 -- # repeat_pid=314752 00:07:14.369 14:20:10 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:07:14.369 14:20:10 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:07:14.369 14:20:10 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 314752' 00:07:14.369 Process app_repeat pid: 314752 00:07:14.369 14:20:10 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:14.369 14:20:10 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:07:14.369 spdk_app_start Round 0 00:07:14.369 14:20:10 event.app_repeat -- event/event.sh@25 -- # waitforlisten 314752 /var/tmp/spdk-nbd.sock 00:07:14.369 14:20:10 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 314752 ']' 00:07:14.369 14:20:10 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:14.369 14:20:10 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:14.369 14:20:10 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:14.369 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:14.369 14:20:10 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:14.369 14:20:10 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:14.369 [2024-11-18 14:20:10.329823] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 
00:07:14.369 [2024-11-18 14:20:10.329909] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid314752 ] 00:07:14.369 [2024-11-18 14:20:10.419086] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:14.369 [2024-11-18 14:20:10.443320] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.369 [2024-11-18 14:20:10.443320] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:14.628 14:20:10 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:14.628 14:20:10 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:07:14.629 14:20:10 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:14.629 Malloc0 00:07:14.629 14:20:10 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:14.888 Malloc1 00:07:14.888 14:20:10 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:14.888 14:20:10 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:14.888 14:20:10 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:14.888 14:20:10 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:14.888 14:20:10 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:14.888 14:20:10 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:14.888 14:20:10 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:14.888 14:20:10 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:14.888 14:20:10 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:14.888 14:20:10 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:14.888 14:20:10 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:14.888 14:20:10 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:14.888 14:20:10 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:14.888 14:20:10 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:14.888 14:20:10 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:14.888 14:20:10 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:15.148 /dev/nbd0 00:07:15.148 14:20:11 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:15.148 14:20:11 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:15.148 14:20:11 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:07:15.148 14:20:11 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:15.148 14:20:11 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:15.148 14:20:11 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:15.148 14:20:11 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd0 
/proc/partitions 00:07:15.148 14:20:11 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:15.148 14:20:11 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:15.148 14:20:11 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:15.148 14:20:11 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:15.148 1+0 records in 00:07:15.148 1+0 records out 00:07:15.148 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00025708 s, 15.9 MB/s 00:07:15.148 14:20:11 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:15.148 14:20:11 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:15.148 14:20:11 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:15.148 14:20:11 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:15.148 14:20:11 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:15.148 14:20:11 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:15.148 14:20:11 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:15.148 14:20:11 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:15.408 /dev/nbd1 00:07:15.408 14:20:11 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:15.408 14:20:11 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:15.408 14:20:11 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:07:15.408 14:20:11 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:15.408 14:20:11 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:15.408 14:20:11 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:15.408 14:20:11 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:07:15.408 14:20:11 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:15.408 14:20:11 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:15.408 14:20:11 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:15.408 14:20:11 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:15.408 1+0 records in 00:07:15.408 1+0 records out 00:07:15.408 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000211304 s, 19.4 MB/s 00:07:15.408 14:20:11 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:15.408 14:20:11 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:15.408 14:20:11 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:15.408 14:20:11 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:15.408 14:20:11 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:15.408 14:20:11 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:15.408 14:20:11 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 
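Condensing the Round 0 bring-up traced above: load the nbd driver, launch app_repeat, wait for its RPC socket, then create two malloc bdevs and export them as kernel block devices. A minimal sketch with paths abbreviated to the spdk tree root (killprocess and waitforlisten are the harness helpers from common/autotest_common.sh; Malloc0 and Malloc1 are the names the create calls printed back):

  modprobe -n nbd && modprobe nbd                      # capability check, then load the NBD driver
  ./test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 &
  repeat_pid=$!
  trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT
  waitforlisten "$repeat_pid" /var/tmp/spdk-nbd.sock   # block until the socket answers
  rpc="./scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
  $rpc bdev_malloc_create 64 4096                      # 64 MB malloc bdev, 4096-byte blocks -> Malloc0
  $rpc bdev_malloc_create 64 4096                      # -> Malloc1
  $rpc nbd_start_disk Malloc0 /dev/nbd0                # export each bdev over NBD
  $rpc nbd_start_disk Malloc1 /dev/nbd1                # waitfornbd then polls /proc/partitions for each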
00:07:15.408 14:20:11 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:15.408 14:20:11 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:15.408 14:20:11 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:15.668 14:20:11 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:15.668 { 00:07:15.668 "nbd_device": "/dev/nbd0", 00:07:15.668 "bdev_name": "Malloc0" 00:07:15.668 }, 00:07:15.668 { 00:07:15.668 "nbd_device": "/dev/nbd1", 00:07:15.668 "bdev_name": "Malloc1" 00:07:15.668 } 00:07:15.668 ]' 00:07:15.668 14:20:11 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:15.668 { 00:07:15.668 "nbd_device": "/dev/nbd0", 00:07:15.668 "bdev_name": "Malloc0" 00:07:15.668 }, 00:07:15.668 { 00:07:15.668 "nbd_device": "/dev/nbd1", 00:07:15.668 "bdev_name": "Malloc1" 00:07:15.668 } 00:07:15.668 ]' 00:07:15.668 14:20:11 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:15.668 14:20:11 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:15.668 /dev/nbd1' 00:07:15.668 14:20:11 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:15.668 /dev/nbd1' 00:07:15.668 14:20:11 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:15.668 14:20:11 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:15.668 14:20:11 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:15.668 14:20:11 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:15.668 14:20:11 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:15.668 14:20:11 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:15.668 14:20:11 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:15.668 14:20:11 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:15.668 14:20:11 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:15.669 14:20:11 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:15.669 14:20:11 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:15.669 14:20:11 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:15.669 256+0 records in 00:07:15.669 256+0 records out 00:07:15.669 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.010903 s, 96.2 MB/s 00:07:15.669 14:20:11 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:15.669 14:20:11 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:15.669 256+0 records in 00:07:15.669 256+0 records out 00:07:15.669 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0195225 s, 53.7 MB/s 00:07:15.669 14:20:11 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:15.669 14:20:11 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:15.669 256+0 records in 00:07:15.669 256+0 records out 00:07:15.669 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0234921 s, 44.6 MB/s 
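The write half of the data check just completed; the verify half follows below. As a standalone sketch of the whole pattern (temp path shortened; the sizes and flags match the trace):

  dd if=/dev/urandom of=/tmp/nbdrandtest bs=4096 count=256            # seed 1 MiB of random data
  for nbd in /dev/nbd0 /dev/nbd1; do
    dd if=/tmp/nbdrandtest of="$nbd" bs=4096 count=256 oflag=direct   # write it through each export
  done
  for nbd in /dev/nbd0 /dev/nbd1; do
    cmp -b -n 1M /tmp/nbdrandtest "$nbd"                              # byte-compare the first 1M back
  done
  rm /tmp/nbdrandtest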
00:07:15.669 14:20:11 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:15.669 14:20:11 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:15.929 14:20:11 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:15.929 14:20:11 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:15.929 14:20:11 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:15.929 14:20:11 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:15.929 14:20:11 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:15.929 14:20:11 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:15.929 14:20:11 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:15.929 14:20:11 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:15.929 14:20:11 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:15.929 14:20:11 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:15.929 14:20:11 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:15.929 14:20:11 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:15.929 14:20:11 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:15.929 14:20:11 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:15.929 14:20:11 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:15.929 14:20:11 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:15.929 14:20:11 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:15.929 14:20:12 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:15.929 14:20:12 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:15.929 14:20:12 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:15.929 14:20:12 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:15.929 14:20:12 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:15.929 14:20:12 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:15.929 14:20:12 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:15.929 14:20:12 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:15.929 14:20:12 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:15.929 14:20:12 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:16.190 14:20:12 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:16.190 14:20:12 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:16.190 14:20:12 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:16.190 14:20:12 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:16.190 14:20:12 event.app_repeat -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:16.190 14:20:12 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:16.190 14:20:12 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:16.190 14:20:12 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:16.190 14:20:12 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:16.190 14:20:12 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:16.190 14:20:12 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:16.450 14:20:12 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:16.451 14:20:12 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:16.451 14:20:12 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:16.451 14:20:12 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:16.451 14:20:12 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:16.451 14:20:12 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:16.451 14:20:12 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:16.451 14:20:12 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:16.451 14:20:12 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:16.451 14:20:12 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:16.451 14:20:12 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:16.451 14:20:12 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:16.451 14:20:12 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:16.710 14:20:12 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:16.970 [2024-11-18 14:20:12.859882] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:16.970 [2024-11-18 14:20:12.879785] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:16.970 [2024-11-18 14:20:12.879795] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.971 [2024-11-18 14:20:12.920199] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:16.971 [2024-11-18 14:20:12.920240] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:20.263 14:20:15 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:20.263 14:20:15 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:07:20.263 spdk_app_start Round 1 00:07:20.263 14:20:15 event.app_repeat -- event/event.sh@25 -- # waitforlisten 314752 /var/tmp/spdk-nbd.sock 00:07:20.263 14:20:15 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 314752 ']' 00:07:20.263 14:20:15 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:20.263 14:20:15 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:20.263 14:20:15 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:20.263 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
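Between rounds the harness tears everything down over the same socket, as just traced for Round 0; condensed here, reusing $rpc from the bring-up sketch above:

  $rpc nbd_stop_disk /dev/nbd0        # waitfornbd_exit then polls /proc/partitions until the entry clears
  $rpc nbd_stop_disk /dev/nbd1
  $rpc nbd_get_disks                  # '[]' once both exports are gone, hence count=0 above
  $rpc spdk_kill_instance SIGTERM     # app_repeat handles SIGTERM and rolls into the next round
  sleep 3                             # settle delay before waitforlisten on the restarted app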
00:07:20.263 14:20:15 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:20.263 14:20:15 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:20.263 14:20:15 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:20.263 14:20:15 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:07:20.263 14:20:15 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:20.263 Malloc0 00:07:20.263 14:20:16 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:20.263 Malloc1 00:07:20.263 14:20:16 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:20.263 14:20:16 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:20.263 14:20:16 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:20.263 14:20:16 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:20.263 14:20:16 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:20.263 14:20:16 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:20.263 14:20:16 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:20.263 14:20:16 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:20.263 14:20:16 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:20.263 14:20:16 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:20.263 14:20:16 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:20.263 14:20:16 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:20.263 14:20:16 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:20.263 14:20:16 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:20.263 14:20:16 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:20.263 14:20:16 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:20.523 /dev/nbd0 00:07:20.523 14:20:16 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:20.523 14:20:16 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:20.523 14:20:16 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:07:20.523 14:20:16 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:20.523 14:20:16 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:20.523 14:20:16 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:20.523 14:20:16 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:07:20.523 14:20:16 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:20.523 14:20:16 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:20.523 14:20:16 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:20.523 14:20:16 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:20.523 1+0 records in 00:07:20.523 1+0 records out 00:07:20.523 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000241081 s, 17.0 MB/s 00:07:20.523 14:20:16 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:20.523 14:20:16 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:20.523 14:20:16 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:20.523 14:20:16 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:20.523 14:20:16 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:20.523 14:20:16 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:20.523 14:20:16 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:20.523 14:20:16 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:20.783 /dev/nbd1 00:07:20.783 14:20:16 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:20.783 14:20:16 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:20.783 14:20:16 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:07:20.783 14:20:16 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:20.783 14:20:16 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:20.783 14:20:16 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:20.783 14:20:16 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:07:20.783 14:20:16 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:20.783 14:20:16 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:20.783 14:20:16 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:20.783 14:20:16 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:20.783 1+0 records in 00:07:20.783 1+0 records out 00:07:20.783 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000255669 s, 16.0 MB/s 00:07:20.783 14:20:16 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:20.783 14:20:16 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:20.783 14:20:16 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:20.783 14:20:16 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:20.783 14:20:16 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:20.783 14:20:16 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:20.783 14:20:16 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:20.783 14:20:16 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:20.783 14:20:16 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:20.783 14:20:16 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:07:21.043 14:20:17 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:21.043 { 00:07:21.043 "nbd_device": "/dev/nbd0", 00:07:21.043 "bdev_name": "Malloc0" 00:07:21.043 }, 00:07:21.043 { 00:07:21.043 "nbd_device": "/dev/nbd1", 00:07:21.043 "bdev_name": "Malloc1" 00:07:21.043 } 00:07:21.043 ]' 00:07:21.043 14:20:17 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:21.043 { 00:07:21.043 "nbd_device": "/dev/nbd0", 00:07:21.043 "bdev_name": "Malloc0" 00:07:21.043 }, 00:07:21.043 { 00:07:21.043 "nbd_device": "/dev/nbd1", 00:07:21.043 "bdev_name": "Malloc1" 00:07:21.043 } 00:07:21.043 ]' 00:07:21.043 14:20:17 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:21.043 14:20:17 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:21.043 /dev/nbd1' 00:07:21.043 14:20:17 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:21.043 /dev/nbd1' 00:07:21.043 14:20:17 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:21.043 14:20:17 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:21.043 14:20:17 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:21.043 14:20:17 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:21.043 14:20:17 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:21.043 14:20:17 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:21.043 14:20:17 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:21.043 14:20:17 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:21.043 14:20:17 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:21.043 14:20:17 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:21.043 14:20:17 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:21.044 14:20:17 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:21.044 256+0 records in 00:07:21.044 256+0 records out 00:07:21.044 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0116365 s, 90.1 MB/s 00:07:21.044 14:20:17 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:21.044 14:20:17 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:21.044 256+0 records in 00:07:21.044 256+0 records out 00:07:21.044 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0199617 s, 52.5 MB/s 00:07:21.044 14:20:17 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:21.044 14:20:17 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:21.303 256+0 records in 00:07:21.303 256+0 records out 00:07:21.303 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0212953 s, 49.2 MB/s 00:07:21.303 14:20:17 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:21.303 14:20:17 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:21.303 14:20:17 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:21.303 14:20:17 event.app_repeat -- bdev/nbd_common.sh@71 -- # 
local operation=verify 00:07:21.303 14:20:17 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:21.303 14:20:17 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:21.303 14:20:17 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:21.303 14:20:17 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:21.303 14:20:17 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:21.303 14:20:17 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:21.303 14:20:17 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:21.303 14:20:17 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:21.303 14:20:17 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:21.303 14:20:17 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:21.303 14:20:17 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:21.303 14:20:17 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:21.303 14:20:17 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:21.303 14:20:17 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:21.304 14:20:17 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:21.304 14:20:17 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:21.304 14:20:17 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:21.304 14:20:17 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:21.304 14:20:17 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:21.304 14:20:17 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:21.304 14:20:17 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:21.304 14:20:17 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:21.304 14:20:17 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:21.304 14:20:17 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:21.304 14:20:17 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:21.563 14:20:17 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:21.563 14:20:17 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:21.563 14:20:17 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:21.563 14:20:17 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:21.563 14:20:17 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:21.563 14:20:17 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:21.563 14:20:17 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:21.563 14:20:17 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:21.563 14:20:17 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
nbd_get_count /var/tmp/spdk-nbd.sock 00:07:21.563 14:20:17 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:21.563 14:20:17 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:21.822 14:20:17 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:21.822 14:20:17 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:21.822 14:20:17 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:21.822 14:20:17 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:21.822 14:20:17 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:21.822 14:20:17 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:21.823 14:20:17 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:21.823 14:20:17 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:21.823 14:20:17 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:21.823 14:20:17 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:21.823 14:20:17 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:21.823 14:20:17 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:21.823 14:20:17 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:22.082 14:20:18 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:22.082 [2024-11-18 14:20:18.206741] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:22.343 [2024-11-18 14:20:18.226515] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.343 [2024-11-18 14:20:18.226515] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:22.343 [2024-11-18 14:20:18.267148] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:22.343 [2024-11-18 14:20:18.267192] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:25.642 14:20:21 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:25.642 14:20:21 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:07:25.642 spdk_app_start Round 2 00:07:25.642 14:20:21 event.app_repeat -- event/event.sh@25 -- # waitforlisten 314752 /var/tmp/spdk-nbd.sock 00:07:25.642 14:20:21 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 314752 ']' 00:07:25.642 14:20:21 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:25.642 14:20:21 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:25.642 14:20:21 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:25.642 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
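The waitfornbd polling that shows up in every round (and again below for Round 2) reduces to two bounded loops. A reduced sketch of the attach-side helper — the loop bounds, the one-block direct-I/O read, and the stat size check match the trace, while the 0.1 s back-off is an assumption, since only the success path is traced here:

  waitfornbd() {                      # reduced sketch of the helper in common/autotest_common.sh
    local nbd_name=$1 i size
    for ((i = 1; i <= 20; i++)); do   # wait for the partition entry to appear
      grep -q -w "$nbd_name" /proc/partitions && break
      sleep 0.1                       # assumed back-off between polls
    done
    for ((i = 1; i <= 20; i++)); do   # prove one block can be read back with direct I/O
      dd if="/dev/$nbd_name" of=/tmp/nbdtest bs=4096 count=1 iflag=direct || { sleep 0.1; continue; }
      size=$(stat -c %s /tmp/nbdtest)
      rm -f /tmp/nbdtest
      [ "$size" != 0 ] && return 0
    done
    return 1
  }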
00:07:25.642 14:20:21 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:25.642 14:20:21 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:25.642 14:20:21 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:25.642 14:20:21 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:07:25.642 14:20:21 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:25.642 Malloc0 00:07:25.642 14:20:21 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:25.642 Malloc1 00:07:25.642 14:20:21 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:25.642 14:20:21 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:25.642 14:20:21 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:25.642 14:20:21 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:25.642 14:20:21 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:25.642 14:20:21 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:25.642 14:20:21 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:25.643 14:20:21 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:25.643 14:20:21 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:25.643 14:20:21 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:25.643 14:20:21 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:25.643 14:20:21 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:25.643 14:20:21 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:25.643 14:20:21 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:25.643 14:20:21 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:25.643 14:20:21 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:25.902 /dev/nbd0 00:07:25.902 14:20:21 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:25.902 14:20:21 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:25.902 14:20:21 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:07:25.902 14:20:21 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:25.902 14:20:21 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:25.902 14:20:21 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:25.902 14:20:21 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:07:25.902 14:20:21 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:25.902 14:20:21 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:25.902 14:20:21 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:25.902 14:20:21 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:25.902 1+0 records in 00:07:25.902 1+0 records out 00:07:25.902 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000174358 s, 23.5 MB/s 00:07:25.902 14:20:21 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:25.902 14:20:21 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:25.902 14:20:21 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:25.902 14:20:21 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:25.902 14:20:21 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:25.902 14:20:21 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:25.902 14:20:21 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:25.903 14:20:21 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:26.161 /dev/nbd1 00:07:26.161 14:20:22 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:26.161 14:20:22 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:26.161 14:20:22 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:07:26.161 14:20:22 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:26.162 14:20:22 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:26.162 14:20:22 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:26.162 14:20:22 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:07:26.162 14:20:22 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:26.162 14:20:22 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:26.162 14:20:22 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:26.162 14:20:22 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:26.162 1+0 records in 00:07:26.162 1+0 records out 00:07:26.162 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000246169 s, 16.6 MB/s 00:07:26.162 14:20:22 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:26.162 14:20:22 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:26.162 14:20:22 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:26.162 14:20:22 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:26.162 14:20:22 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:26.162 14:20:22 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:26.162 14:20:22 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:26.162 14:20:22 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:26.162 14:20:22 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:26.162 14:20:22 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:07:26.421 14:20:22 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:26.421 { 00:07:26.421 "nbd_device": "/dev/nbd0", 00:07:26.421 "bdev_name": "Malloc0" 00:07:26.421 }, 00:07:26.421 { 00:07:26.421 "nbd_device": "/dev/nbd1", 00:07:26.421 "bdev_name": "Malloc1" 00:07:26.421 } 00:07:26.421 ]' 00:07:26.421 14:20:22 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:26.421 { 00:07:26.421 "nbd_device": "/dev/nbd0", 00:07:26.421 "bdev_name": "Malloc0" 00:07:26.421 }, 00:07:26.421 { 00:07:26.421 "nbd_device": "/dev/nbd1", 00:07:26.421 "bdev_name": "Malloc1" 00:07:26.421 } 00:07:26.421 ]' 00:07:26.421 14:20:22 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:26.421 14:20:22 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:26.421 /dev/nbd1' 00:07:26.421 14:20:22 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:26.421 /dev/nbd1' 00:07:26.421 14:20:22 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:26.421 14:20:22 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:26.421 14:20:22 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:26.421 14:20:22 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:26.421 14:20:22 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:26.421 14:20:22 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:26.421 14:20:22 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:26.421 14:20:22 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:26.421 14:20:22 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:26.421 14:20:22 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:26.421 14:20:22 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:26.421 14:20:22 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:26.421 256+0 records in 00:07:26.421 256+0 records out 00:07:26.421 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0116558 s, 90.0 MB/s 00:07:26.421 14:20:22 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:26.421 14:20:22 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:26.421 256+0 records in 00:07:26.421 256+0 records out 00:07:26.421 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.019833 s, 52.9 MB/s 00:07:26.421 14:20:22 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:26.421 14:20:22 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:26.421 256+0 records in 00:07:26.421 256+0 records out 00:07:26.421 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0235081 s, 44.6 MB/s 00:07:26.421 14:20:22 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:26.421 14:20:22 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:26.681 14:20:22 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:26.681 14:20:22 event.app_repeat -- bdev/nbd_common.sh@71 -- # 
local operation=verify 00:07:26.681 14:20:22 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:26.681 14:20:22 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:26.681 14:20:22 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:26.681 14:20:22 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:26.681 14:20:22 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:26.681 14:20:22 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:26.681 14:20:22 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:26.681 14:20:22 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:26.681 14:20:22 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:26.681 14:20:22 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:26.681 14:20:22 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:26.681 14:20:22 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:26.681 14:20:22 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:26.681 14:20:22 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:26.681 14:20:22 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:26.681 14:20:22 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:26.681 14:20:22 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:26.681 14:20:22 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:26.681 14:20:22 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:26.681 14:20:22 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:26.681 14:20:22 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:26.681 14:20:22 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:26.681 14:20:22 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:26.681 14:20:22 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:26.681 14:20:22 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:26.941 14:20:22 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:26.941 14:20:22 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:26.941 14:20:22 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:26.941 14:20:22 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:26.941 14:20:22 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:26.941 14:20:22 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:26.941 14:20:22 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:26.941 14:20:22 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:26.941 14:20:23 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
nbd_get_count /var/tmp/spdk-nbd.sock 00:07:26.941 14:20:23 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:26.941 14:20:23 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:27.201 14:20:23 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:27.201 14:20:23 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:27.201 14:20:23 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:27.201 14:20:23 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:27.201 14:20:23 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:27.201 14:20:23 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:27.201 14:20:23 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:27.201 14:20:23 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:27.201 14:20:23 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:27.201 14:20:23 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:27.201 14:20:23 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:27.201 14:20:23 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:27.201 14:20:23 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:27.461 14:20:23 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:27.722 [2024-11-18 14:20:23.609870] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:27.722 [2024-11-18 14:20:23.629803] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.722 [2024-11-18 14:20:23.629803] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:27.722 [2024-11-18 14:20:23.669888] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:27.722 [2024-11-18 14:20:23.669944] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:31.014 14:20:26 event.app_repeat -- event/event.sh@38 -- # waitforlisten 314752 /var/tmp/spdk-nbd.sock 00:07:31.014 14:20:26 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 314752 ']' 00:07:31.014 14:20:26 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:31.014 14:20:26 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:31.014 14:20:26 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:31.014 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
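The app_repeat trace above exercises the nbd_common.sh data-verify helper: seed 1 MiB of random data into a temp file, dd it onto every exported NBD device with O_DIRECT, then cmp each device back against the seed. A minimal sketch of that round trip, reconstructed from the traced commands (paths shortened; an illustration of the pattern, not the exact nbd_common.sh source):

    # write phase: one random seed file, pushed to every NBD device
    tmp_file=/tmp/nbdrandtest
    dd if=/dev/urandom of="$tmp_file" bs=4096 count=256              # 1 MiB of random data
    for dev in /dev/nbd0 /dev/nbd1; do
        dd if="$tmp_file" of="$dev" bs=4096 count=256 oflag=direct   # O_DIRECT bypasses the page cache
    done

    # verify phase: every device must read back byte-identical data
    for dev in /dev/nbd0 /dev/nbd1; do
        cmp -b -n 1M "$tmp_file" "$dev"                              # any difference fails the test
    done
    rm "$tmp_file"

After nbd_stop_disk tears the exports down, the helper re-counts them with rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks | jq -r '.[] | .nbd_device' | grep -c /dev/nbd and expects 0; the trailing true in the trace absorbs grep's non-zero exit status when the count is zero.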
00:07:31.014 14:20:26 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:31.014 14:20:26 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:31.014 14:20:26 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:31.014 14:20:26 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:07:31.014 14:20:26 event.app_repeat -- event/event.sh@39 -- # killprocess 314752 00:07:31.014 14:20:26 event.app_repeat -- common/autotest_common.sh@954 -- # '[' -z 314752 ']' 00:07:31.014 14:20:26 event.app_repeat -- common/autotest_common.sh@958 -- # kill -0 314752 00:07:31.014 14:20:26 event.app_repeat -- common/autotest_common.sh@959 -- # uname 00:07:31.014 14:20:26 event.app_repeat -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:31.014 14:20:26 event.app_repeat -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 314752 00:07:31.014 14:20:26 event.app_repeat -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:31.014 14:20:26 event.app_repeat -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:31.014 14:20:26 event.app_repeat -- common/autotest_common.sh@972 -- # echo 'killing process with pid 314752' 00:07:31.014 killing process with pid 314752 00:07:31.014 14:20:26 event.app_repeat -- common/autotest_common.sh@973 -- # kill 314752 00:07:31.014 14:20:26 event.app_repeat -- common/autotest_common.sh@978 -- # wait 314752 00:07:31.014 spdk_app_start is called in Round 0. 00:07:31.014 Shutdown signal received, stop current app iteration 00:07:31.014 Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 reinitialization... 00:07:31.014 spdk_app_start is called in Round 1. 00:07:31.014 Shutdown signal received, stop current app iteration 00:07:31.014 Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 reinitialization... 00:07:31.014 spdk_app_start is called in Round 2. 00:07:31.014 Shutdown signal received, stop current app iteration 00:07:31.014 Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 reinitialization... 00:07:31.014 spdk_app_start is called in Round 3. 
00:07:31.014 Shutdown signal received, stop current app iteration 00:07:31.014 14:20:26 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:07:31.014 14:20:26 event.app_repeat -- event/event.sh@42 -- # return 0 00:07:31.014 00:07:31.014 real 0m16.548s 00:07:31.014 user 0m35.831s 00:07:31.014 sys 0m3.253s 00:07:31.014 14:20:26 event.app_repeat -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:31.014 14:20:26 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:31.014 ************************************ 00:07:31.014 END TEST app_repeat 00:07:31.014 ************************************ 00:07:31.014 14:20:26 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:07:31.014 14:20:26 event -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:07:31.014 14:20:26 event -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:31.014 14:20:26 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:31.014 14:20:26 event -- common/autotest_common.sh@10 -- # set +x 00:07:31.014 ************************************ 00:07:31.014 START TEST cpu_locks 00:07:31.014 ************************************ 00:07:31.014 14:20:26 event.cpu_locks -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:07:31.015 * Looking for test storage... 00:07:31.015 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:07:31.015 14:20:27 event.cpu_locks -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:31.015 14:20:27 event.cpu_locks -- common/autotest_common.sh@1693 -- # lcov --version 00:07:31.015 14:20:27 event.cpu_locks -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:31.015 14:20:27 event.cpu_locks -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@336 -- # IFS=.-: 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@336 -- # read -ra ver1 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@337 -- # IFS=.-: 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@337 -- # read -ra ver2 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@338 -- # local 'op=<' 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@340 -- # ver1_l=2 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@341 -- # ver2_l=1 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@344 -- # case "$op" in 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@345 -- # : 1 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@365 -- # decimal 1 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@353 -- # local d=1 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@355 -- # echo 1 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@365 -- # ver1[v]=1 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@366 -- # decimal 2 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@353 -- # local d=2 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@355 -- # echo 2 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@366 -- # ver2[v]=2 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:31.015 14:20:27 event.cpu_locks -- scripts/common.sh@368 -- # return 0 00:07:31.015 14:20:27 event.cpu_locks -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:31.015 14:20:27 event.cpu_locks -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:31.015 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:31.015 --rc genhtml_branch_coverage=1 00:07:31.015 --rc genhtml_function_coverage=1 00:07:31.015 --rc genhtml_legend=1 00:07:31.015 --rc geninfo_all_blocks=1 00:07:31.015 --rc geninfo_unexecuted_blocks=1 00:07:31.015 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:31.015 ' 00:07:31.015 14:20:27 event.cpu_locks -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:31.015 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:31.015 --rc genhtml_branch_coverage=1 00:07:31.015 --rc genhtml_function_coverage=1 00:07:31.015 --rc genhtml_legend=1 00:07:31.015 --rc geninfo_all_blocks=1 00:07:31.015 --rc geninfo_unexecuted_blocks=1 00:07:31.015 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:31.015 ' 00:07:31.015 14:20:27 event.cpu_locks -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:31.015 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:31.015 --rc genhtml_branch_coverage=1 00:07:31.015 --rc genhtml_function_coverage=1 00:07:31.015 --rc genhtml_legend=1 00:07:31.015 --rc geninfo_all_blocks=1 00:07:31.015 --rc geninfo_unexecuted_blocks=1 00:07:31.015 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:31.015 ' 00:07:31.015 14:20:27 event.cpu_locks -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:31.015 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:31.015 --rc genhtml_branch_coverage=1 00:07:31.015 --rc genhtml_function_coverage=1 00:07:31.015 --rc genhtml_legend=1 00:07:31.015 --rc geninfo_all_blocks=1 00:07:31.015 --rc geninfo_unexecuted_blocks=1 00:07:31.015 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:31.015 ' 00:07:31.015 14:20:27 event.cpu_locks -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:07:31.015 14:20:27 event.cpu_locks -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:07:31.015 14:20:27 event.cpu_locks -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:07:31.015 14:20:27 event.cpu_locks -- 
event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:07:31.015 14:20:27 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:31.015 14:20:27 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:31.015 14:20:27 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:31.276 ************************************ 00:07:31.276 START TEST default_locks 00:07:31.276 ************************************ 00:07:31.276 14:20:27 event.cpu_locks.default_locks -- common/autotest_common.sh@1129 -- # default_locks 00:07:31.276 14:20:27 event.cpu_locks.default_locks -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=317922 00:07:31.276 14:20:27 event.cpu_locks.default_locks -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:31.276 14:20:27 event.cpu_locks.default_locks -- event/cpu_locks.sh@47 -- # waitforlisten 317922 00:07:31.276 14:20:27 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # '[' -z 317922 ']' 00:07:31.276 14:20:27 event.cpu_locks.default_locks -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:31.276 14:20:27 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:31.276 14:20:27 event.cpu_locks.default_locks -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:31.276 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:31.276 14:20:27 event.cpu_locks.default_locks -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:31.276 14:20:27 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:31.276 [2024-11-18 14:20:27.193717] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 
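The lcov probe at the top of the cpu_locks suite walks scripts/common.sh's version comparison: lt 1.15 2 calls cmp_versions, which splits both version strings on '.', '-' and ':' and compares the fields as integers, exactly as the xtrace above replays it. A condensed sketch of that logic (only the '<' branch; the real helper also validates each field via decimal and handles the other operators):

    # cmp_versions_lt VER1 VER2  ->  exit 0 iff VER1 < VER2
    cmp_versions_lt() {
        local IFS=.-:                      # split on the same separators as the trace
        local -a ver1 ver2
        read -ra ver1 <<< "$1"
        read -ra ver2 <<< "$2"
        local v max=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
        for (( v = 0; v < max; v++ )); do
            (( ${ver1[v]:-0} > ${ver2[v]:-0} )) && return 1   # first differing field decides
            (( ${ver1[v]:-0} < ${ver2[v]:-0} )) && return 0
        done
        return 1                           # equal versions are not '<'
    }

So lt 1.15 2 succeeds at the very first field (1 < 2), which is why the trace then sets lcov_rc_opt to the pre-2.0 --rc lcov_branch_coverage flags.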
00:07:31.276 [2024-11-18 14:20:27.193773] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid317922 ] 00:07:31.276 [2024-11-18 14:20:27.280180] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:31.276 [2024-11-18 14:20:27.302264] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.537 14:20:27 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:31.537 14:20:27 event.cpu_locks.default_locks -- common/autotest_common.sh@868 -- # return 0 00:07:31.537 14:20:27 event.cpu_locks.default_locks -- event/cpu_locks.sh@49 -- # locks_exist 317922 00:07:31.537 14:20:27 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # lslocks -p 317922 00:07:31.537 14:20:27 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:31.797 lslocks: write error 00:07:31.797 14:20:27 event.cpu_locks.default_locks -- event/cpu_locks.sh@50 -- # killprocess 317922 00:07:31.797 14:20:27 event.cpu_locks.default_locks -- common/autotest_common.sh@954 -- # '[' -z 317922 ']' 00:07:31.797 14:20:27 event.cpu_locks.default_locks -- common/autotest_common.sh@958 -- # kill -0 317922 00:07:31.797 14:20:27 event.cpu_locks.default_locks -- common/autotest_common.sh@959 -- # uname 00:07:31.797 14:20:27 event.cpu_locks.default_locks -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:31.797 14:20:27 event.cpu_locks.default_locks -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 317922 00:07:31.797 14:20:27 event.cpu_locks.default_locks -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:31.797 14:20:27 event.cpu_locks.default_locks -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:31.797 14:20:27 event.cpu_locks.default_locks -- common/autotest_common.sh@972 -- # echo 'killing process with pid 317922' 00:07:31.797 killing process with pid 317922 00:07:31.797 14:20:27 event.cpu_locks.default_locks -- common/autotest_common.sh@973 -- # kill 317922 00:07:31.797 14:20:27 event.cpu_locks.default_locks -- common/autotest_common.sh@978 -- # wait 317922 00:07:32.363 14:20:28 event.cpu_locks.default_locks -- event/cpu_locks.sh@52 -- # NOT waitforlisten 317922 00:07:32.363 14:20:28 event.cpu_locks.default_locks -- common/autotest_common.sh@652 -- # local es=0 00:07:32.363 14:20:28 event.cpu_locks.default_locks -- common/autotest_common.sh@654 -- # valid_exec_arg waitforlisten 317922 00:07:32.364 14:20:28 event.cpu_locks.default_locks -- common/autotest_common.sh@640 -- # local arg=waitforlisten 00:07:32.364 14:20:28 event.cpu_locks.default_locks -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:32.364 14:20:28 event.cpu_locks.default_locks -- common/autotest_common.sh@644 -- # type -t waitforlisten 00:07:32.364 14:20:28 event.cpu_locks.default_locks -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:32.364 14:20:28 event.cpu_locks.default_locks -- common/autotest_common.sh@655 -- # waitforlisten 317922 00:07:32.364 14:20:28 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # '[' -z 317922 ']' 00:07:32.364 14:20:28 event.cpu_locks.default_locks -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:32.364 14:20:28 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # local max_retries=100 
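default_locks asserts the core lock indirectly: locks_exist lists the file locks held by the target with lslocks -p PID and greps for the spdk_cpu_lock prefix. The stray 'lslocks: write error' above is expected noise rather than a failure: grep -q exits on the first match and closes the pipe, so lslocks reports EPIPE on its remaining output. A sketch of the check as traced (helper name and lock prefix follow the trace; the real function may differ in detail):

    # succeed iff the given pid holds a /var/tmp/spdk_cpu_lock_* file lock
    locks_exist() {
        local pid=$1
        lslocks -p "$pid" | grep -q spdk_cpu_lock   # grep -q can provoke "lslocks: write error" (EPIPE)
    }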
00:07:32.364 14:20:28 event.cpu_locks.default_locks -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:32.364 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:32.364 14:20:28 event.cpu_locks.default_locks -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:32.364 14:20:28 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:32.364 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 850: kill: (317922) - No such process 00:07:32.364 ERROR: process (pid: 317922) is no longer running 00:07:32.364 14:20:28 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:32.364 14:20:28 event.cpu_locks.default_locks -- common/autotest_common.sh@868 -- # return 1 00:07:32.364 14:20:28 event.cpu_locks.default_locks -- common/autotest_common.sh@655 -- # es=1 00:07:32.364 14:20:28 event.cpu_locks.default_locks -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:32.364 14:20:28 event.cpu_locks.default_locks -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:32.364 14:20:28 event.cpu_locks.default_locks -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:32.364 14:20:28 event.cpu_locks.default_locks -- event/cpu_locks.sh@54 -- # no_locks 00:07:32.364 14:20:28 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # lock_files=() 00:07:32.364 14:20:28 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # local lock_files 00:07:32.364 14:20:28 event.cpu_locks.default_locks -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:07:32.364 00:07:32.364 real 0m1.021s 00:07:32.364 user 0m0.980s 00:07:32.364 sys 0m0.517s 00:07:32.364 14:20:28 event.cpu_locks.default_locks -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:32.364 14:20:28 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:32.364 ************************************ 00:07:32.364 END TEST default_locks 00:07:32.364 ************************************ 00:07:32.364 14:20:28 event.cpu_locks -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:07:32.364 14:20:28 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:32.364 14:20:28 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:32.364 14:20:28 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:32.364 ************************************ 00:07:32.364 START TEST default_locks_via_rpc 00:07:32.364 ************************************ 00:07:32.364 14:20:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1129 -- # default_locks_via_rpc 00:07:32.364 14:20:28 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=318193 00:07:32.364 14:20:28 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@63 -- # waitforlisten 318193 00:07:32.364 14:20:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 318193 ']' 00:07:32.364 14:20:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:32.364 14:20:28 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:32.364 14:20:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:32.364 14:20:28 
event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:32.364 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:32.364 14:20:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:32.364 14:20:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:32.364 [2024-11-18 14:20:28.287683] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:07:32.364 [2024-11-18 14:20:28.287741] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid318193 ] 00:07:32.364 [2024-11-18 14:20:28.371642] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:32.364 [2024-11-18 14:20:28.393886] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:32.623 14:20:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:32.623 14:20:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:32.623 14:20:28 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:07:32.623 14:20:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:32.623 14:20:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:32.623 14:20:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:32.623 14:20:28 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@67 -- # no_locks 00:07:32.623 14:20:28 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # lock_files=() 00:07:32.623 14:20:28 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # local lock_files 00:07:32.623 14:20:28 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:07:32.623 14:20:28 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:07:32.623 14:20:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:32.623 14:20:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:32.623 14:20:28 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:32.623 14:20:28 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@71 -- # locks_exist 318193 00:07:32.623 14:20:28 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # lslocks -p 318193 00:07:32.623 14:20:28 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:33.192 14:20:29 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@73 -- # killprocess 318193 00:07:33.192 14:20:29 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@954 -- # '[' -z 318193 ']' 00:07:33.192 14:20:29 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@958 -- # kill -0 318193 00:07:33.192 14:20:29 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@959 -- # uname 00:07:33.192 14:20:29 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 
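default_locks_via_rpc toggles the same lock at runtime instead of at launch: with the target up and holding its core 0 lock, framework_disable_cpumask_locks must release the lock files (checked by the no_locks glob) and framework_enable_cpumask_locks must re-claim them, which locks_exist 318193 then confirms. A sketch of that sequence using only the RPCs the trace invokes (rpc.py path shortened; pid variable named as in the trace):

    rpc=scripts/rpc.py
    shopt -s nullglob                               # so an empty glob yields an empty array
    $rpc framework_disable_cpumask_locks            # target drops its spdk_cpu_lock files
    lock_files=(/var/tmp/spdk_cpu_lock_*)
    (( ${#lock_files[@]} == 0 )) || echo "unexpected: locks remain"
    $rpc framework_enable_cpumask_locks             # target re-claims the locks
    lslocks -p "$spdk_tgt_pid" | grep -q spdk_cpu_lock || echo "unexpected: lock missing"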
00:07:33.192 14:20:29 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 318193 00:07:33.192 14:20:29 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:33.192 14:20:29 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:33.192 14:20:29 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 318193' 00:07:33.192 killing process with pid 318193 00:07:33.192 14:20:29 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@973 -- # kill 318193 00:07:33.192 14:20:29 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@978 -- # wait 318193 00:07:33.451 00:07:33.451 real 0m1.115s 00:07:33.451 user 0m1.087s 00:07:33.451 sys 0m0.555s 00:07:33.451 14:20:29 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:33.451 14:20:29 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:33.451 ************************************ 00:07:33.451 END TEST default_locks_via_rpc 00:07:33.451 ************************************ 00:07:33.451 14:20:29 event.cpu_locks -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:07:33.451 14:20:29 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:33.451 14:20:29 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:33.451 14:20:29 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:33.451 ************************************ 00:07:33.451 START TEST non_locking_app_on_locked_coremask 00:07:33.451 ************************************ 00:07:33.451 14:20:29 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1129 -- # non_locking_app_on_locked_coremask 00:07:33.451 14:20:29 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=318310 00:07:33.451 14:20:29 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@81 -- # waitforlisten 318310 /var/tmp/spdk.sock 00:07:33.452 14:20:29 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:33.452 14:20:29 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 318310 ']' 00:07:33.452 14:20:29 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:33.452 14:20:29 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:33.452 14:20:29 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:33.452 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:33.452 14:20:29 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:33.452 14:20:29 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:33.452 [2024-11-18 14:20:29.493954] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 
00:07:33.452 [2024-11-18 14:20:29.494016] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid318310 ] 00:07:33.452 [2024-11-18 14:20:29.575944] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:33.711 [2024-11-18 14:20:29.597797] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:33.711 14:20:29 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:33.711 14:20:29 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:33.711 14:20:29 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=318477 00:07:33.711 14:20:29 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@85 -- # waitforlisten 318477 /var/tmp/spdk2.sock 00:07:33.711 14:20:29 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:07:33.711 14:20:29 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 318477 ']' 00:07:33.711 14:20:29 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:33.711 14:20:29 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:33.711 14:20:29 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:33.711 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:33.711 14:20:29 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:33.711 14:20:29 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:33.971 [2024-11-18 14:20:29.839968] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:07:33.971 [2024-11-18 14:20:29.840045] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid318477 ] 00:07:33.971 [2024-11-18 14:20:29.937321] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
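non_locking_app_on_locked_coremask checks that a held core lock only blocks targets that actually want it: instance one takes core 0's lock, instance two starts on the same mask with --disable-cpumask-locks and its own RPC socket, and both must come up ('CPU core locks deactivated' above is the second instance acknowledging the opt-out). The launch pattern, condensed from the trace (binary path shortened):

    spdk_tgt -m 0x1 &                                                  # claims /var/tmp/spdk_cpu_lock_000
    pid1=$!
    spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock &   # same core, no lock taken
    pid2=$!
    # expectation: only the first instance shows up as a lock holder
    lslocks -p "$pid1" | grep -q spdk_cpu_lock                         # succeeds
    ! lslocks -p "$pid2" | grep -q spdk_cpu_lock                       # the opt-out holds nothing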
00:07:33.971 [2024-11-18 14:20:29.937352] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:33.971 [2024-11-18 14:20:29.983149] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:34.231 14:20:30 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:34.231 14:20:30 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:34.231 14:20:30 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@87 -- # locks_exist 318310 00:07:34.231 14:20:30 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 318310 00:07:34.231 14:20:30 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:34.800 lslocks: write error 00:07:34.800 14:20:30 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@89 -- # killprocess 318310 00:07:34.800 14:20:30 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' -z 318310 ']' 00:07:34.800 14:20:30 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@958 -- # kill -0 318310 00:07:34.800 14:20:30 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:34.800 14:20:30 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:34.800 14:20:30 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 318310 00:07:35.059 14:20:30 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:35.059 14:20:30 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:35.059 14:20:30 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 318310' 00:07:35.059 killing process with pid 318310 00:07:35.059 14:20:30 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # kill 318310 00:07:35.059 14:20:30 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@978 -- # wait 318310 00:07:35.629 14:20:31 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@90 -- # killprocess 318477 00:07:35.629 14:20:31 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' -z 318477 ']' 00:07:35.629 14:20:31 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@958 -- # kill -0 318477 00:07:35.629 14:20:31 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:35.629 14:20:31 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:35.629 14:20:31 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 318477 00:07:35.629 14:20:31 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:35.629 14:20:31 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:35.629 14:20:31 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 318477' 00:07:35.629 killing 
process with pid 318477 00:07:35.630 14:20:31 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # kill 318477 00:07:35.630 14:20:31 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@978 -- # wait 318477 00:07:35.890 00:07:35.890 real 0m2.387s 00:07:35.890 user 0m2.365s 00:07:35.890 sys 0m1.011s 00:07:35.890 14:20:31 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:35.890 14:20:31 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:35.890 ************************************ 00:07:35.890 END TEST non_locking_app_on_locked_coremask 00:07:35.890 ************************************ 00:07:35.890 14:20:31 event.cpu_locks -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:07:35.890 14:20:31 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:35.890 14:20:31 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:35.890 14:20:31 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:35.890 ************************************ 00:07:35.890 START TEST locking_app_on_unlocked_coremask 00:07:35.890 ************************************ 00:07:35.890 14:20:31 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1129 -- # locking_app_on_unlocked_coremask 00:07:35.890 14:20:31 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=318813 00:07:35.890 14:20:31 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@99 -- # waitforlisten 318813 /var/tmp/spdk.sock 00:07:35.890 14:20:31 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:07:35.890 14:20:31 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # '[' -z 318813 ']' 00:07:35.890 14:20:31 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:35.890 14:20:31 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:35.890 14:20:31 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:35.890 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:35.890 14:20:31 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:35.890 14:20:31 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:35.890 [2024-11-18 14:20:31.963908] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:07:35.890 [2024-11-18 14:20:31.963961] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid318813 ] 00:07:36.150 [2024-11-18 14:20:32.049365] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
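Each test then tears its targets down through the killprocess helper replayed all through this trace: confirm the pid is alive with kill -0, read its command name with ps so a sudo wrapper is never signalled directly, then kill and wait to reap it. A condensed sketch (the real helper special-cases the sudo path rather than refusing it; that branch is simplified here):

    killprocess() {
        local pid=$1
        kill -0 "$pid" || return 1                  # must still be running
        local name
        name=$(ps --no-headers -o comm= "$pid")     # e.g. reactor_0 for an SPDK target
        [ "$name" = sudo ] && return 1              # simplification: real code signals the child instead
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid"                                 # reap and propagate the exit status
    }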
00:07:36.150 [2024-11-18 14:20:32.049389] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:36.150 [2024-11-18 14:20:32.071503] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:36.150 14:20:32 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:36.150 14:20:32 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:36.150 14:20:32 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=318817 00:07:36.150 14:20:32 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@103 -- # waitforlisten 318817 /var/tmp/spdk2.sock 00:07:36.150 14:20:32 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:07:36.150 14:20:32 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # '[' -z 318817 ']' 00:07:36.150 14:20:32 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:36.150 14:20:32 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:36.150 14:20:32 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:36.150 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:36.150 14:20:32 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:36.150 14:20:32 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:36.410 [2024-11-18 14:20:32.285259] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 
00:07:36.410 [2024-11-18 14:20:32.285344] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid318817 ] 00:07:36.410 [2024-11-18 14:20:32.382933] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:36.410 [2024-11-18 14:20:32.428577] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.349 14:20:33 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:37.349 14:20:33 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:37.349 14:20:33 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@105 -- # locks_exist 318817 00:07:37.349 14:20:33 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 318817 00:07:37.349 14:20:33 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:38.289 lslocks: write error 00:07:38.289 14:20:34 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@107 -- # killprocess 318813 00:07:38.289 14:20:34 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # '[' -z 318813 ']' 00:07:38.289 14:20:34 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@958 -- # kill -0 318813 00:07:38.289 14:20:34 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:38.289 14:20:34 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:38.289 14:20:34 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 318813 00:07:38.289 14:20:34 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:38.289 14:20:34 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:38.289 14:20:34 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 318813' 00:07:38.289 killing process with pid 318813 00:07:38.289 14:20:34 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@973 -- # kill 318813 00:07:38.289 14:20:34 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@978 -- # wait 318813 00:07:38.858 14:20:34 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@108 -- # killprocess 318817 00:07:38.858 14:20:34 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # '[' -z 318817 ']' 00:07:38.858 14:20:34 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@958 -- # kill -0 318817 00:07:38.858 14:20:34 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:38.858 14:20:34 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:38.858 14:20:34 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 318817 00:07:38.858 14:20:34 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:38.858 14:20:34 
event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:38.858 14:20:34 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 318817' 00:07:38.858 killing process with pid 318817 00:07:38.858 14:20:34 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@973 -- # kill 318817 00:07:38.858 14:20:34 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@978 -- # wait 318817 00:07:39.428 00:07:39.428 real 0m3.321s 00:07:39.428 user 0m3.498s 00:07:39.428 sys 0m1.247s 00:07:39.428 14:20:35 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:39.428 14:20:35 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:39.428 ************************************ 00:07:39.428 END TEST locking_app_on_unlocked_coremask 00:07:39.428 ************************************ 00:07:39.428 14:20:35 event.cpu_locks -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:07:39.428 14:20:35 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:39.428 14:20:35 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:39.428 14:20:35 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:39.428 ************************************ 00:07:39.428 START TEST locking_app_on_locked_coremask 00:07:39.428 ************************************ 00:07:39.428 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1129 -- # locking_app_on_locked_coremask 00:07:39.428 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=319384 00:07:39.428 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@116 -- # waitforlisten 319384 /var/tmp/spdk.sock 00:07:39.428 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:39.428 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 319384 ']' 00:07:39.428 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:39.428 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:39.428 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:39.428 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:39.428 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:39.428 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:39.428 [2024-11-18 14:20:35.371904] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 
00:07:39.428 [2024-11-18 14:20:35.371986] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid319384 ] 00:07:39.428 [2024-11-18 14:20:35.448850] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:39.428 [2024-11-18 14:20:35.469399] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.689 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:39.689 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:39.689 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=319432 00:07:39.689 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@120 -- # NOT waitforlisten 319432 /var/tmp/spdk2.sock 00:07:39.689 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:07:39.689 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@652 -- # local es=0 00:07:39.689 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@654 -- # valid_exec_arg waitforlisten 319432 /var/tmp/spdk2.sock 00:07:39.689 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@640 -- # local arg=waitforlisten 00:07:39.689 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:39.689 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@644 -- # type -t waitforlisten 00:07:39.689 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:39.689 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@655 -- # waitforlisten 319432 /var/tmp/spdk2.sock 00:07:39.689 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 319432 ']' 00:07:39.689 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:39.689 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:39.689 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:39.689 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:39.689 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:39.689 14:20:35 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:39.689 [2024-11-18 14:20:35.708812] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 
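locking_app_on_locked_coremask expects the opposite outcome: a second target asking for the very same mask must fail to start, so the test wraps waitforlisten 319432 in the NOT helper whose es bookkeeping is visible further down. In essence (condensed; the real helper also validates its argument and remaps signal exit codes above 128, details assumed here):

    # NOT: invert a command's status, succeeding only when the command fails
    NOT() {
        local es=0
        "$@" || es=$?
        (( es != 0 ))
    }

With that inversion, the expected 'Cannot create lock on core 0, probably process 319384 has claimed it' error counts as a pass.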
00:07:39.689 [2024-11-18 14:20:35.708878] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid319432 ] 00:07:39.689 [2024-11-18 14:20:35.807510] app.c: 782:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 319384 has claimed it. 00:07:39.689 [2024-11-18 14:20:35.811557] app.c: 912:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:40.258 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 850: kill: (319432) - No such process 00:07:40.258 ERROR: process (pid: 319432) is no longer running 00:07:40.258 14:20:36 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:40.258 14:20:36 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 1 00:07:40.258 14:20:36 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@655 -- # es=1 00:07:40.258 14:20:36 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:40.258 14:20:36 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:40.258 14:20:36 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:40.258 14:20:36 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@122 -- # locks_exist 319384 00:07:40.258 14:20:36 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 319384 00:07:40.258 14:20:36 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:41.200 lslocks: write error 00:07:41.200 14:20:37 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@124 -- # killprocess 319384 00:07:41.200 14:20:37 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' -z 319384 ']' 00:07:41.200 14:20:37 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@958 -- # kill -0 319384 00:07:41.200 14:20:37 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:41.200 14:20:37 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:41.200 14:20:37 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 319384 00:07:41.200 14:20:37 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:41.200 14:20:37 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:41.200 14:20:37 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 319384' 00:07:41.200 killing process with pid 319384 00:07:41.200 14:20:37 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # kill 319384 00:07:41.200 14:20:37 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@978 -- # wait 319384 00:07:41.460 00:07:41.460 real 0m2.027s 00:07:41.460 user 0m2.144s 00:07:41.460 sys 0m0.785s 00:07:41.460 14:20:37 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:41.460 
14:20:37 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:41.460 ************************************ 00:07:41.460 END TEST locking_app_on_locked_coremask 00:07:41.460 ************************************ 00:07:41.460 14:20:37 event.cpu_locks -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:07:41.460 14:20:37 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:41.460 14:20:37 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:41.460 14:20:37 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:41.460 ************************************ 00:07:41.460 START TEST locking_overlapped_coremask 00:07:41.460 ************************************ 00:07:41.460 14:20:37 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1129 -- # locking_overlapped_coremask 00:07:41.460 14:20:37 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=319824 00:07:41.460 14:20:37 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@133 -- # waitforlisten 319824 /var/tmp/spdk.sock 00:07:41.460 14:20:37 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:07:41.460 14:20:37 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # '[' -z 319824 ']' 00:07:41.460 14:20:37 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:41.460 14:20:37 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:41.460 14:20:37 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:41.460 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:41.460 14:20:37 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:41.460 14:20:37 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:41.460 [2024-11-18 14:20:37.481476] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 
00:07:41.460 [2024-11-18 14:20:37.481533] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid319824 ] 00:07:41.460 [2024-11-18 14:20:37.566934] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:41.721 [2024-11-18 14:20:37.592324] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:41.721 [2024-11-18 14:20:37.592434] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.721 [2024-11-18 14:20:37.592435] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:41.721 14:20:37 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:41.721 14:20:37 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:41.721 14:20:37 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=319949 00:07:41.721 14:20:37 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@137 -- # NOT waitforlisten 319949 /var/tmp/spdk2.sock 00:07:41.721 14:20:37 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:07:41.721 14:20:37 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@652 -- # local es=0 00:07:41.721 14:20:37 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@654 -- # valid_exec_arg waitforlisten 319949 /var/tmp/spdk2.sock 00:07:41.721 14:20:37 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@640 -- # local arg=waitforlisten 00:07:41.721 14:20:37 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:41.721 14:20:37 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@644 -- # type -t waitforlisten 00:07:41.721 14:20:37 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:41.721 14:20:37 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@655 -- # waitforlisten 319949 /var/tmp/spdk2.sock 00:07:41.721 14:20:37 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # '[' -z 319949 ']' 00:07:41.721 14:20:37 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:41.721 14:20:37 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:41.721 14:20:37 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:41.721 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:41.721 14:20:37 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:41.721 14:20:37 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:41.721 [2024-11-18 14:20:37.821666] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 
00:07:41.721 [2024-11-18 14:20:37.821757] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid319949 ] 00:07:41.981 [2024-11-18 14:20:37.924049] app.c: 782:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 319824 has claimed it. 00:07:41.981 [2024-11-18 14:20:37.924084] app.c: 912:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:42.551 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 850: kill: (319949) - No such process 00:07:42.551 ERROR: process (pid: 319949) is no longer running 00:07:42.551 14:20:38 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:42.551 14:20:38 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@868 -- # return 1 00:07:42.551 14:20:38 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@655 -- # es=1 00:07:42.551 14:20:38 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:42.551 14:20:38 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:42.551 14:20:38 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:42.551 14:20:38 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:07:42.551 14:20:38 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:42.551 14:20:38 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:42.551 14:20:38 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:42.551 14:20:38 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@141 -- # killprocess 319824 00:07:42.551 14:20:38 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@954 -- # '[' -z 319824 ']' 00:07:42.551 14:20:38 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@958 -- # kill -0 319824 00:07:42.551 14:20:38 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@959 -- # uname 00:07:42.551 14:20:38 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:42.551 14:20:38 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 319824 00:07:42.551 14:20:38 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:42.551 14:20:38 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:42.551 14:20:38 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 319824' 00:07:42.551 killing process with pid 319824 00:07:42.551 14:20:38 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@973 -- # kill 319824 00:07:42.551 14:20:38 
event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@978 -- # wait 319824 00:07:42.812 00:07:42.812 real 0m1.383s 00:07:42.812 user 0m3.832s 00:07:42.812 sys 0m0.436s 00:07:42.812 14:20:38 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:42.812 14:20:38 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:42.812 ************************************ 00:07:42.812 END TEST locking_overlapped_coremask 00:07:42.812 ************************************ 00:07:42.812 14:20:38 event.cpu_locks -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:07:42.812 14:20:38 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:42.812 14:20:38 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:42.812 14:20:38 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:42.812 ************************************ 00:07:42.812 START TEST locking_overlapped_coremask_via_rpc 00:07:42.812 ************************************ 00:07:42.812 14:20:38 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1129 -- # locking_overlapped_coremask_via_rpc 00:07:42.812 14:20:38 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=320081 00:07:42.812 14:20:38 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@149 -- # waitforlisten 320081 /var/tmp/spdk.sock 00:07:42.812 14:20:38 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:07:42.812 14:20:38 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 320081 ']' 00:07:42.812 14:20:38 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:42.812 14:20:38 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:42.812 14:20:38 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:42.812 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:42.812 14:20:38 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:42.812 14:20:38 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:43.073 [2024-11-18 14:20:38.952218] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:07:43.073 [2024-11-18 14:20:38.952297] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid320081 ] 00:07:43.073 [2024-11-18 14:20:39.022111] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
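The "CPU core locks deactivated" notice above is the direct effect of the --disable-cpumask-locks flag: the target starts without claiming the /var/tmp/spdk_cpu_lock_* files, and the test re-arms locking later over JSON-RPC. A minimal sketch of that pairing, using only the binaries, flags, and socket paths already visible in this log:

    # start a target with core-mask locking off (cores 0-2)
    spdk_tgt -m 0x7 --disable-cpumask-locks &
    # later, claim the lock files for the running target's cores
    rpc.py -s /var/tmp/spdk.sock framework_enable_cpumask_locks
    ls /var/tmp/spdk_cpu_lock_*   # expect spdk_cpu_lock_000..002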
00:07:43.073 [2024-11-18 14:20:39.022139] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:43.073 [2024-11-18 14:20:39.047418] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:43.073 [2024-11-18 14:20:39.047528] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.073 [2024-11-18 14:20:39.047529] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:43.334 14:20:39 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:43.334 14:20:39 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:43.334 14:20:39 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=320244 00:07:43.334 14:20:39 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@153 -- # waitforlisten 320244 /var/tmp/spdk2.sock 00:07:43.334 14:20:39 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:07:43.334 14:20:39 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 320244 ']' 00:07:43.334 14:20:39 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:43.334 14:20:39 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:43.334 14:20:39 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:43.334 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:43.334 14:20:39 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:43.334 14:20:39 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:43.334 [2024-11-18 14:20:39.281468] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:07:43.334 [2024-11-18 14:20:39.281533] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid320244 ] 00:07:43.334 [2024-11-18 14:20:39.383084] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:07:43.334 [2024-11-18 14:20:39.383115] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:43.334 [2024-11-18 14:20:39.431851] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 3 00:07:43.334 [2024-11-18 14:20:39.431947] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:43.334 [2024-11-18 14:20:39.431948] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 4 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@652 -- # local es=0 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@654 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@640 -- # local arg=rpc_cmd 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@644 -- # type -t rpc_cmd 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@655 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:44.276 [2024-11-18 14:20:40.162615] app.c: 782:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 320081 has claimed it. 
00:07:44.276 request: 00:07:44.276 { 00:07:44.276 "method": "framework_enable_cpumask_locks", 00:07:44.276 "req_id": 1 00:07:44.276 } 00:07:44.276 Got JSON-RPC error response 00:07:44.276 response: 00:07:44.276 { 00:07:44.276 "code": -32603, 00:07:44.276 "message": "Failed to claim CPU core: 2" 00:07:44.276 } 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]] 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@655 -- # es=1 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@158 -- # waitforlisten 320081 /var/tmp/spdk.sock 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 320081 ']' 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:44.276 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@159 -- # waitforlisten 320244 /var/tmp/spdk2.sock 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 320244 ']' 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:44.276 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
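The -32603 "Failed to claim CPU core: 2" response above is the expected outcome: the two targets' core masks overlap. A quick shell check of where they collide, taking the -m values from the commands above:

    $ printf 'overlap: 0x%x\n' $(( 0x7 & 0x1c ))
    overlap: 0x4          # bit 2 set => core 2, the core named in the error

0x7 covers cores 0-2 and 0x1c covers cores 2-4, so core 2 is claimed twice and the second framework_enable_cpumask_locks call can only fail.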
00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:44.276 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:44.541 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:44.541 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:44.541 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:07:44.541 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:44.541 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:44.541 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:44.541 00:07:44.541 real 0m1.655s 00:07:44.541 user 0m0.818s 00:07:44.541 sys 0m0.165s 00:07:44.541 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:44.541 14:20:40 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:44.541 ************************************ 00:07:44.541 END TEST locking_overlapped_coremask_via_rpc 00:07:44.541 ************************************ 00:07:44.541 14:20:40 event.cpu_locks -- event/cpu_locks.sh@174 -- # cleanup 00:07:44.541 14:20:40 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 320081 ]] 00:07:44.541 14:20:40 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 320081 00:07:44.541 14:20:40 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 320081 ']' 00:07:44.541 14:20:40 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 320081 00:07:44.541 14:20:40 event.cpu_locks -- common/autotest_common.sh@959 -- # uname 00:07:44.541 14:20:40 event.cpu_locks -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:44.541 14:20:40 event.cpu_locks -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 320081 00:07:44.801 14:20:40 event.cpu_locks -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:44.801 14:20:40 event.cpu_locks -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:44.801 14:20:40 event.cpu_locks -- common/autotest_common.sh@972 -- # echo 'killing process with pid 320081' 00:07:44.801 killing process with pid 320081 00:07:44.801 14:20:40 event.cpu_locks -- common/autotest_common.sh@973 -- # kill 320081 00:07:44.801 14:20:40 event.cpu_locks -- common/autotest_common.sh@978 -- # wait 320081 00:07:45.061 14:20:40 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 320244 ]] 00:07:45.061 14:20:40 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 320244 00:07:45.061 14:20:40 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 320244 ']' 00:07:45.061 14:20:40 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 320244 00:07:45.061 14:20:40 event.cpu_locks -- common/autotest_common.sh@959 -- # uname 00:07:45.062 14:20:40 event.cpu_locks -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 
00:07:45.062 14:20:40 event.cpu_locks -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 320244 00:07:45.062 14:20:41 event.cpu_locks -- common/autotest_common.sh@960 -- # process_name=reactor_2 00:07:45.062 14:20:41 event.cpu_locks -- common/autotest_common.sh@964 -- # '[' reactor_2 = sudo ']' 00:07:45.062 14:20:41 event.cpu_locks -- common/autotest_common.sh@972 -- # echo 'killing process with pid 320244' 00:07:45.062 killing process with pid 320244 00:07:45.062 14:20:41 event.cpu_locks -- common/autotest_common.sh@973 -- # kill 320244 00:07:45.062 14:20:41 event.cpu_locks -- common/autotest_common.sh@978 -- # wait 320244 00:07:45.321 14:20:41 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:07:45.321 14:20:41 event.cpu_locks -- event/cpu_locks.sh@1 -- # cleanup 00:07:45.321 14:20:41 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 320081 ]] 00:07:45.321 14:20:41 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 320081 00:07:45.321 14:20:41 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 320081 ']' 00:07:45.321 14:20:41 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 320081 00:07:45.321 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 958: kill: (320081) - No such process 00:07:45.321 14:20:41 event.cpu_locks -- common/autotest_common.sh@981 -- # echo 'Process with pid 320081 is not found' 00:07:45.321 Process with pid 320081 is not found 00:07:45.321 14:20:41 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 320244 ]] 00:07:45.321 14:20:41 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 320244 00:07:45.321 14:20:41 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 320244 ']' 00:07:45.321 14:20:41 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 320244 00:07:45.321 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 958: kill: (320244) - No such process 00:07:45.321 14:20:41 event.cpu_locks -- common/autotest_common.sh@981 -- # echo 'Process with pid 320244 is not found' 00:07:45.321 Process with pid 320244 is not found 00:07:45.321 14:20:41 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:07:45.321 00:07:45.321 real 0m14.415s 00:07:45.321 user 0m24.572s 00:07:45.321 sys 0m5.817s 00:07:45.321 14:20:41 event.cpu_locks -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:45.321 14:20:41 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:45.321 ************************************ 00:07:45.321 END TEST cpu_locks 00:07:45.321 ************************************ 00:07:45.321 00:07:45.321 real 0m38.939s 00:07:45.321 user 1m12.582s 00:07:45.321 sys 0m10.275s 00:07:45.321 14:20:41 event -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:45.321 14:20:41 event -- common/autotest_common.sh@10 -- # set +x 00:07:45.321 ************************************ 00:07:45.321 END TEST event 00:07:45.321 ************************************ 00:07:45.321 14:20:41 -- spdk/autotest.sh@169 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:45.321 14:20:41 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:45.321 14:20:41 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:45.321 14:20:41 -- common/autotest_common.sh@10 -- # set +x 00:07:45.581 ************************************ 00:07:45.581 START TEST thread 00:07:45.581 ************************************ 00:07:45.581 14:20:41 thread -- common/autotest_common.sh@1129 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:45.581 * Looking for test storage... 00:07:45.581 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:07:45.581 14:20:41 thread -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:45.581 14:20:41 thread -- common/autotest_common.sh@1693 -- # lcov --version 00:07:45.581 14:20:41 thread -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:45.581 14:20:41 thread -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:45.581 14:20:41 thread -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:45.581 14:20:41 thread -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:45.581 14:20:41 thread -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:45.581 14:20:41 thread -- scripts/common.sh@336 -- # IFS=.-: 00:07:45.581 14:20:41 thread -- scripts/common.sh@336 -- # read -ra ver1 00:07:45.581 14:20:41 thread -- scripts/common.sh@337 -- # IFS=.-: 00:07:45.581 14:20:41 thread -- scripts/common.sh@337 -- # read -ra ver2 00:07:45.581 14:20:41 thread -- scripts/common.sh@338 -- # local 'op=<' 00:07:45.581 14:20:41 thread -- scripts/common.sh@340 -- # ver1_l=2 00:07:45.581 14:20:41 thread -- scripts/common.sh@341 -- # ver2_l=1 00:07:45.581 14:20:41 thread -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:45.581 14:20:41 thread -- scripts/common.sh@344 -- # case "$op" in 00:07:45.581 14:20:41 thread -- scripts/common.sh@345 -- # : 1 00:07:45.581 14:20:41 thread -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:45.581 14:20:41 thread -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:45.581 14:20:41 thread -- scripts/common.sh@365 -- # decimal 1 00:07:45.581 14:20:41 thread -- scripts/common.sh@353 -- # local d=1 00:07:45.581 14:20:41 thread -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:45.581 14:20:41 thread -- scripts/common.sh@355 -- # echo 1 00:07:45.581 14:20:41 thread -- scripts/common.sh@365 -- # ver1[v]=1 00:07:45.581 14:20:41 thread -- scripts/common.sh@366 -- # decimal 2 00:07:45.581 14:20:41 thread -- scripts/common.sh@353 -- # local d=2 00:07:45.581 14:20:41 thread -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:45.581 14:20:41 thread -- scripts/common.sh@355 -- # echo 2 00:07:45.581 14:20:41 thread -- scripts/common.sh@366 -- # ver2[v]=2 00:07:45.581 14:20:41 thread -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:45.581 14:20:41 thread -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:45.581 14:20:41 thread -- scripts/common.sh@368 -- # return 0 00:07:45.581 14:20:41 thread -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:45.581 14:20:41 thread -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:45.581 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:45.581 --rc genhtml_branch_coverage=1 00:07:45.581 --rc genhtml_function_coverage=1 00:07:45.581 --rc genhtml_legend=1 00:07:45.581 --rc geninfo_all_blocks=1 00:07:45.581 --rc geninfo_unexecuted_blocks=1 00:07:45.581 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:45.581 ' 00:07:45.582 14:20:41 thread -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:45.582 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:45.582 --rc genhtml_branch_coverage=1 00:07:45.582 --rc genhtml_function_coverage=1 00:07:45.582 --rc genhtml_legend=1 00:07:45.582 --rc geninfo_all_blocks=1 
00:07:45.582 --rc geninfo_unexecuted_blocks=1 00:07:45.582 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:45.582 ' 00:07:45.582 14:20:41 thread -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:45.582 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:45.582 --rc genhtml_branch_coverage=1 00:07:45.582 --rc genhtml_function_coverage=1 00:07:45.582 --rc genhtml_legend=1 00:07:45.582 --rc geninfo_all_blocks=1 00:07:45.582 --rc geninfo_unexecuted_blocks=1 00:07:45.582 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:45.582 ' 00:07:45.582 14:20:41 thread -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:45.582 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:45.582 --rc genhtml_branch_coverage=1 00:07:45.582 --rc genhtml_function_coverage=1 00:07:45.582 --rc genhtml_legend=1 00:07:45.582 --rc geninfo_all_blocks=1 00:07:45.582 --rc geninfo_unexecuted_blocks=1 00:07:45.582 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:45.582 ' 00:07:45.582 14:20:41 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:45.582 14:20:41 thread -- common/autotest_common.sh@1105 -- # '[' 8 -le 1 ']' 00:07:45.582 14:20:41 thread -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:45.582 14:20:41 thread -- common/autotest_common.sh@10 -- # set +x 00:07:45.842 ************************************ 00:07:45.842 START TEST thread_poller_perf 00:07:45.842 ************************************ 00:07:45.842 14:20:41 thread.thread_poller_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:45.842 [2024-11-18 14:20:41.734795] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:07:45.842 [2024-11-18 14:20:41.734900] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid320640 ] 00:07:45.842 [2024-11-18 14:20:41.824020] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.842 [2024-11-18 14:20:41.846220] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.842 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:07:46.783 [2024-11-18T13:20:42.913Z] ====================================== 00:07:46.783 [2024-11-18T13:20:42.913Z] busy:2503624262 (cyc) 00:07:46.783 [2024-11-18T13:20:42.913Z] total_run_count: 846000 00:07:46.783 [2024-11-18T13:20:42.913Z] tsc_hz: 2500000000 (cyc) 00:07:46.783 [2024-11-18T13:20:42.913Z] ====================================== 00:07:46.783 [2024-11-18T13:20:42.913Z] poller_cost: 2959 (cyc), 1183 (nsec) 00:07:46.783 00:07:46.783 real 0m1.160s 00:07:46.783 user 0m1.062s 00:07:46.783 sys 0m0.095s 00:07:46.783 14:20:42 thread.thread_poller_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:46.783 14:20:42 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:46.783 ************************************ 00:07:46.783 END TEST thread_poller_perf 00:07:46.783 ************************************ 00:07:47.043 14:20:42 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:47.043 14:20:42 thread -- common/autotest_common.sh@1105 -- # '[' 8 -le 1 ']' 00:07:47.043 14:20:42 thread -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:47.043 14:20:42 thread -- common/autotest_common.sh@10 -- # set +x 00:07:47.043 ************************************ 00:07:47.043 START TEST thread_poller_perf 00:07:47.043 ************************************ 00:07:47.043 14:20:42 thread.thread_poller_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:47.043 [2024-11-18 14:20:42.981097] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:07:47.043 [2024-11-18 14:20:42.981209] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid320922 ] 00:07:47.043 [2024-11-18 14:20:43.069246] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:47.043 [2024-11-18 14:20:43.092656] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.043 Running 1000 pollers for 1 seconds with 0 microseconds period. 
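The poller_cost line in the table above is just the two printed counters combined (a back-of-the-envelope check, taking the reported values at face value):

    poller_cost = busy / total_run_count = 2503624262 / 846000 ~= 2959 cyc
    2959 cyc / (2500000000 cyc/s)                              ~= 1183 nsec

The zero-period run that follows reports 187 cyc / 74 nsec by the same arithmetic, which is roughly the per-iteration overhead of an empty poller on this machine.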
00:07:48.426 [2024-11-18T13:20:44.556Z] ====================================== 00:07:48.426 [2024-11-18T13:20:44.556Z] busy:2501327690 (cyc) 00:07:48.426 [2024-11-18T13:20:44.556Z] total_run_count: 13338000 00:07:48.426 [2024-11-18T13:20:44.556Z] tsc_hz: 2500000000 (cyc) 00:07:48.426 [2024-11-18T13:20:44.556Z] ====================================== 00:07:48.426 [2024-11-18T13:20:44.556Z] poller_cost: 187 (cyc), 74 (nsec) 00:07:48.426 00:07:48.426 real 0m1.161s 00:07:48.426 user 0m1.062s 00:07:48.426 sys 0m0.095s 00:07:48.426 14:20:44 thread.thread_poller_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:48.426 14:20:44 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:48.426 ************************************ 00:07:48.426 END TEST thread_poller_perf 00:07:48.426 ************************************ 00:07:48.426 14:20:44 thread -- thread/thread.sh@17 -- # [[ n != \y ]] 00:07:48.426 14:20:44 thread -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:48.426 14:20:44 thread -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:48.426 14:20:44 thread -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:48.426 14:20:44 thread -- common/autotest_common.sh@10 -- # set +x 00:07:48.426 ************************************ 00:07:48.426 START TEST thread_spdk_lock 00:07:48.426 ************************************ 00:07:48.426 14:20:44 thread.thread_spdk_lock -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:48.426 [2024-11-18 14:20:44.225734] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:07:48.426 [2024-11-18 14:20:44.225816] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid321202 ] 00:07:48.426 [2024-11-18 14:20:44.317416] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:48.426 [2024-11-18 14:20:44.341969] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.426 [2024-11-18 14:20:44.341969] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:48.996 [2024-11-18 14:20:44.832619] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 980:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:48.996 [2024-11-18 14:20:44.832653] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3112:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:07:48.996 [2024-11-18 14:20:44.832663] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3067:sspin_stacks_print: *ERROR*: spinlock 0x135fc40 00:07:48.996 [2024-11-18 14:20:44.833299] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 875:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:48.996 [2024-11-18 14:20:44.833403] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1041:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:48.996 [2024-11-18 
14:20:44.833422] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 875:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:48.996 Starting test contend 00:07:48.996 Worker Delay Wait us Hold us Total us 00:07:48.996 0 3 164493 185752 350245 00:07:48.996 1 5 80976 286038 367014 00:07:48.996 PASS test contend 00:07:48.996 Starting test hold_by_poller 00:07:48.996 PASS test hold_by_poller 00:07:48.996 Starting test hold_by_message 00:07:48.996 PASS test hold_by_message 00:07:48.996 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:07:48.996 100014 assertions passed 00:07:48.996 0 assertions failed 00:07:48.996 00:07:48.996 real 0m0.653s 00:07:48.996 user 0m1.040s 00:07:48.996 sys 0m0.101s 00:07:48.996 14:20:44 thread.thread_spdk_lock -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:48.997 14:20:44 thread.thread_spdk_lock -- common/autotest_common.sh@10 -- # set +x 00:07:48.997 ************************************ 00:07:48.997 END TEST thread_spdk_lock 00:07:48.997 ************************************ 00:07:48.997 00:07:48.997 real 0m3.426s 00:07:48.997 user 0m3.368s 00:07:48.997 sys 0m0.576s 00:07:48.997 14:20:44 thread -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:48.997 14:20:44 thread -- common/autotest_common.sh@10 -- # set +x 00:07:48.997 ************************************ 00:07:48.997 END TEST thread 00:07:48.997 ************************************ 00:07:48.997 14:20:44 -- spdk/autotest.sh@171 -- # [[ 0 -eq 1 ]] 00:07:48.997 14:20:44 -- spdk/autotest.sh@176 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:48.997 14:20:44 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:48.997 14:20:44 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:48.997 14:20:44 -- common/autotest_common.sh@10 -- # set +x 00:07:48.997 ************************************ 00:07:48.997 START TEST app_cmdline 00:07:48.997 ************************************ 00:07:48.997 14:20:44 app_cmdline -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:48.997 * Looking for test storage... 
00:07:48.997 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:48.997 14:20:45 app_cmdline -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:48.997 14:20:45 app_cmdline -- common/autotest_common.sh@1693 -- # lcov --version 00:07:48.997 14:20:45 app_cmdline -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:49.257 14:20:45 app_cmdline -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@336 -- # IFS=.-: 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@336 -- # read -ra ver1 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@337 -- # IFS=.-: 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@337 -- # read -ra ver2 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@338 -- # local 'op=<' 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@340 -- # ver1_l=2 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@341 -- # ver2_l=1 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@344 -- # case "$op" in 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@345 -- # : 1 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@365 -- # decimal 1 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@353 -- # local d=1 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@355 -- # echo 1 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@365 -- # ver1[v]=1 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@366 -- # decimal 2 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@353 -- # local d=2 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@355 -- # echo 2 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@366 -- # ver2[v]=2 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:49.257 14:20:45 app_cmdline -- scripts/common.sh@368 -- # return 0 00:07:49.257 14:20:45 app_cmdline -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:49.257 14:20:45 app_cmdline -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:49.257 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.257 --rc genhtml_branch_coverage=1 00:07:49.257 --rc genhtml_function_coverage=1 00:07:49.257 --rc genhtml_legend=1 00:07:49.257 --rc geninfo_all_blocks=1 00:07:49.257 --rc geninfo_unexecuted_blocks=1 00:07:49.257 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.257 ' 00:07:49.257 14:20:45 app_cmdline -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:49.257 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.257 --rc genhtml_branch_coverage=1 00:07:49.257 --rc genhtml_function_coverage=1 00:07:49.257 --rc 
genhtml_legend=1 00:07:49.257 --rc geninfo_all_blocks=1 00:07:49.257 --rc geninfo_unexecuted_blocks=1 00:07:49.257 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.257 ' 00:07:49.257 14:20:45 app_cmdline -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:49.257 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.257 --rc genhtml_branch_coverage=1 00:07:49.257 --rc genhtml_function_coverage=1 00:07:49.257 --rc genhtml_legend=1 00:07:49.257 --rc geninfo_all_blocks=1 00:07:49.257 --rc geninfo_unexecuted_blocks=1 00:07:49.257 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.257 ' 00:07:49.257 14:20:45 app_cmdline -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:49.257 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:49.257 --rc genhtml_branch_coverage=1 00:07:49.257 --rc genhtml_function_coverage=1 00:07:49.257 --rc genhtml_legend=1 00:07:49.257 --rc geninfo_all_blocks=1 00:07:49.257 --rc geninfo_unexecuted_blocks=1 00:07:49.257 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:49.257 ' 00:07:49.257 14:20:45 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:49.258 14:20:45 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=321379 00:07:49.258 14:20:45 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 321379 00:07:49.258 14:20:45 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:49.258 14:20:45 app_cmdline -- common/autotest_common.sh@835 -- # '[' -z 321379 ']' 00:07:49.258 14:20:45 app_cmdline -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:49.258 14:20:45 app_cmdline -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:49.258 14:20:45 app_cmdline -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:49.258 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:49.258 14:20:45 app_cmdline -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:49.258 14:20:45 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:49.258 [2024-11-18 14:20:45.215872] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 
00:07:49.258 [2024-11-18 14:20:45.215958] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid321379 ] 00:07:49.258 [2024-11-18 14:20:45.303283] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:49.258 [2024-11-18 14:20:45.325483] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:49.518 14:20:45 app_cmdline -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:49.518 14:20:45 app_cmdline -- common/autotest_common.sh@868 -- # return 0 00:07:49.518 14:20:45 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:49.778 { 00:07:49.778 "version": "SPDK v25.01-pre git sha1 d47eb51c9", 00:07:49.778 "fields": { 00:07:49.778 "major": 25, 00:07:49.778 "minor": 1, 00:07:49.778 "patch": 0, 00:07:49.778 "suffix": "-pre", 00:07:49.778 "commit": "d47eb51c9" 00:07:49.778 } 00:07:49.778 } 00:07:49.778 14:20:45 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:07:49.778 14:20:45 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:49.778 14:20:45 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:49.778 14:20:45 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:49.778 14:20:45 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:49.778 14:20:45 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:49.778 14:20:45 app_cmdline -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:49.778 14:20:45 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:49.778 14:20:45 app_cmdline -- app/cmdline.sh@26 -- # sort 00:07:49.778 14:20:45 app_cmdline -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:49.778 14:20:45 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:49.778 14:20:45 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:49.778 14:20:45 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:49.778 14:20:45 app_cmdline -- common/autotest_common.sh@652 -- # local es=0 00:07:49.778 14:20:45 app_cmdline -- common/autotest_common.sh@654 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:49.778 14:20:45 app_cmdline -- common/autotest_common.sh@640 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:49.778 14:20:45 app_cmdline -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:49.778 14:20:45 app_cmdline -- common/autotest_common.sh@644 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:49.778 14:20:45 app_cmdline -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:49.778 14:20:45 app_cmdline -- common/autotest_common.sh@646 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:49.778 14:20:45 app_cmdline -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:49.778 14:20:45 app_cmdline -- common/autotest_common.sh@646 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:49.778 14:20:45 app_cmdline -- 
common/autotest_common.sh@646 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:49.778 14:20:45 app_cmdline -- common/autotest_common.sh@655 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:50.037 request: 00:07:50.037 { 00:07:50.038 "method": "env_dpdk_get_mem_stats", 00:07:50.038 "req_id": 1 00:07:50.038 } 00:07:50.038 Got JSON-RPC error response 00:07:50.038 response: 00:07:50.038 { 00:07:50.038 "code": -32601, 00:07:50.038 "message": "Method not found" 00:07:50.038 } 00:07:50.038 14:20:45 app_cmdline -- common/autotest_common.sh@655 -- # es=1 00:07:50.038 14:20:45 app_cmdline -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:50.038 14:20:45 app_cmdline -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:50.038 14:20:45 app_cmdline -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:50.038 14:20:45 app_cmdline -- app/cmdline.sh@1 -- # killprocess 321379 00:07:50.038 14:20:45 app_cmdline -- common/autotest_common.sh@954 -- # '[' -z 321379 ']' 00:07:50.038 14:20:45 app_cmdline -- common/autotest_common.sh@958 -- # kill -0 321379 00:07:50.038 14:20:45 app_cmdline -- common/autotest_common.sh@959 -- # uname 00:07:50.038 14:20:45 app_cmdline -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:50.038 14:20:45 app_cmdline -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 321379 00:07:50.038 14:20:46 app_cmdline -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:50.038 14:20:46 app_cmdline -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:50.038 14:20:46 app_cmdline -- common/autotest_common.sh@972 -- # echo 'killing process with pid 321379' 00:07:50.038 killing process with pid 321379 00:07:50.038 14:20:46 app_cmdline -- common/autotest_common.sh@973 -- # kill 321379 00:07:50.038 14:20:46 app_cmdline -- common/autotest_common.sh@978 -- # wait 321379 00:07:50.297 00:07:50.297 real 0m1.307s 00:07:50.297 user 0m1.472s 00:07:50.297 sys 0m0.511s 00:07:50.297 14:20:46 app_cmdline -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:50.297 14:20:46 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:50.297 ************************************ 00:07:50.297 END TEST app_cmdline 00:07:50.297 ************************************ 00:07:50.297 14:20:46 -- spdk/autotest.sh@177 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:50.297 14:20:46 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:50.297 14:20:46 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:50.297 14:20:46 -- common/autotest_common.sh@10 -- # set +x 00:07:50.297 ************************************ 00:07:50.297 START TEST version 00:07:50.297 ************************************ 00:07:50.297 14:20:46 version -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:50.556 * Looking for test storage... 
00:07:50.556 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:50.556 14:20:46 version -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:50.556 14:20:46 version -- common/autotest_common.sh@1693 -- # lcov --version 00:07:50.556 14:20:46 version -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:50.556 14:20:46 version -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:50.556 14:20:46 version -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:50.556 14:20:46 version -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:50.556 14:20:46 version -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:50.556 14:20:46 version -- scripts/common.sh@336 -- # IFS=.-: 00:07:50.556 14:20:46 version -- scripts/common.sh@336 -- # read -ra ver1 00:07:50.556 14:20:46 version -- scripts/common.sh@337 -- # IFS=.-: 00:07:50.556 14:20:46 version -- scripts/common.sh@337 -- # read -ra ver2 00:07:50.556 14:20:46 version -- scripts/common.sh@338 -- # local 'op=<' 00:07:50.556 14:20:46 version -- scripts/common.sh@340 -- # ver1_l=2 00:07:50.556 14:20:46 version -- scripts/common.sh@341 -- # ver2_l=1 00:07:50.556 14:20:46 version -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:50.556 14:20:46 version -- scripts/common.sh@344 -- # case "$op" in 00:07:50.556 14:20:46 version -- scripts/common.sh@345 -- # : 1 00:07:50.556 14:20:46 version -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:50.556 14:20:46 version -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:50.556 14:20:46 version -- scripts/common.sh@365 -- # decimal 1 00:07:50.556 14:20:46 version -- scripts/common.sh@353 -- # local d=1 00:07:50.556 14:20:46 version -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:50.556 14:20:46 version -- scripts/common.sh@355 -- # echo 1 00:07:50.556 14:20:46 version -- scripts/common.sh@365 -- # ver1[v]=1 00:07:50.556 14:20:46 version -- scripts/common.sh@366 -- # decimal 2 00:07:50.556 14:20:46 version -- scripts/common.sh@353 -- # local d=2 00:07:50.556 14:20:46 version -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:50.557 14:20:46 version -- scripts/common.sh@355 -- # echo 2 00:07:50.557 14:20:46 version -- scripts/common.sh@366 -- # ver2[v]=2 00:07:50.557 14:20:46 version -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:50.557 14:20:46 version -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:50.557 14:20:46 version -- scripts/common.sh@368 -- # return 0 00:07:50.557 14:20:46 version -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:50.557 14:20:46 version -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:50.557 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:50.557 --rc genhtml_branch_coverage=1 00:07:50.557 --rc genhtml_function_coverage=1 00:07:50.557 --rc genhtml_legend=1 00:07:50.557 --rc geninfo_all_blocks=1 00:07:50.557 --rc geninfo_unexecuted_blocks=1 00:07:50.557 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:50.557 ' 00:07:50.557 14:20:46 version -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:50.557 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:50.557 --rc genhtml_branch_coverage=1 00:07:50.557 --rc genhtml_function_coverage=1 00:07:50.557 --rc genhtml_legend=1 00:07:50.557 --rc geninfo_all_blocks=1 00:07:50.557 --rc geninfo_unexecuted_blocks=1 00:07:50.557 --gcov-tool 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:50.557 ' 00:07:50.557 14:20:46 version -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:50.557 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:50.557 --rc genhtml_branch_coverage=1 00:07:50.557 --rc genhtml_function_coverage=1 00:07:50.557 --rc genhtml_legend=1 00:07:50.557 --rc geninfo_all_blocks=1 00:07:50.557 --rc geninfo_unexecuted_blocks=1 00:07:50.557 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:50.557 ' 00:07:50.557 14:20:46 version -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:50.557 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:50.557 --rc genhtml_branch_coverage=1 00:07:50.557 --rc genhtml_function_coverage=1 00:07:50.557 --rc genhtml_legend=1 00:07:50.557 --rc geninfo_all_blocks=1 00:07:50.557 --rc geninfo_unexecuted_blocks=1 00:07:50.557 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:50.557 ' 00:07:50.557 14:20:46 version -- app/version.sh@17 -- # get_header_version major 00:07:50.557 14:20:46 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:50.557 14:20:46 version -- app/version.sh@14 -- # cut -f2 00:07:50.557 14:20:46 version -- app/version.sh@14 -- # tr -d '"' 00:07:50.557 14:20:46 version -- app/version.sh@17 -- # major=25 00:07:50.557 14:20:46 version -- app/version.sh@18 -- # get_header_version minor 00:07:50.557 14:20:46 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:50.557 14:20:46 version -- app/version.sh@14 -- # cut -f2 00:07:50.557 14:20:46 version -- app/version.sh@14 -- # tr -d '"' 00:07:50.557 14:20:46 version -- app/version.sh@18 -- # minor=1 00:07:50.557 14:20:46 version -- app/version.sh@19 -- # get_header_version patch 00:07:50.557 14:20:46 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:50.557 14:20:46 version -- app/version.sh@14 -- # cut -f2 00:07:50.557 14:20:46 version -- app/version.sh@14 -- # tr -d '"' 00:07:50.557 14:20:46 version -- app/version.sh@19 -- # patch=0 00:07:50.557 14:20:46 version -- app/version.sh@20 -- # get_header_version suffix 00:07:50.557 14:20:46 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:50.557 14:20:46 version -- app/version.sh@14 -- # cut -f2 00:07:50.557 14:20:46 version -- app/version.sh@14 -- # tr -d '"' 00:07:50.557 14:20:46 version -- app/version.sh@20 -- # suffix=-pre 00:07:50.557 14:20:46 version -- app/version.sh@22 -- # version=25.1 00:07:50.557 14:20:46 version -- app/version.sh@25 -- # (( patch != 0 )) 00:07:50.557 14:20:46 version -- app/version.sh@28 -- # version=25.1rc0 00:07:50.557 14:20:46 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:50.557 14:20:46 version -- app/version.sh@30 -- # 
python3 -c 'import spdk; print(spdk.__version__)' 00:07:50.557 14:20:46 version -- app/version.sh@30 -- # py_version=25.1rc0 00:07:50.557 14:20:46 version -- app/version.sh@31 -- # [[ 25.1rc0 == \2\5\.\1\r\c\0 ]] 00:07:50.557 00:07:50.557 real 0m0.277s 00:07:50.557 user 0m0.163s 00:07:50.557 sys 0m0.171s 00:07:50.557 14:20:46 version -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:50.557 14:20:46 version -- common/autotest_common.sh@10 -- # set +x 00:07:50.557 ************************************ 00:07:50.557 END TEST version 00:07:50.557 ************************************ 00:07:50.817 14:20:46 -- spdk/autotest.sh@179 -- # '[' 0 -eq 1 ']' 00:07:50.817 14:20:46 -- spdk/autotest.sh@188 -- # [[ 0 -eq 1 ]] 00:07:50.817 14:20:46 -- spdk/autotest.sh@194 -- # uname -s 00:07:50.817 14:20:46 -- spdk/autotest.sh@194 -- # [[ Linux == Linux ]] 00:07:50.817 14:20:46 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:50.817 14:20:46 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:50.817 14:20:46 -- spdk/autotest.sh@207 -- # '[' 0 -eq 1 ']' 00:07:50.817 14:20:46 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:07:50.817 14:20:46 -- spdk/autotest.sh@260 -- # timing_exit lib 00:07:50.817 14:20:46 -- common/autotest_common.sh@732 -- # xtrace_disable 00:07:50.817 14:20:46 -- common/autotest_common.sh@10 -- # set +x 00:07:50.817 14:20:46 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:07:50.817 14:20:46 -- spdk/autotest.sh@267 -- # '[' 0 -eq 1 ']' 00:07:50.817 14:20:46 -- spdk/autotest.sh@276 -- # '[' 0 -eq 1 ']' 00:07:50.817 14:20:46 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:50.817 14:20:46 -- spdk/autotest.sh@315 -- # '[' 0 -eq 1 ']' 00:07:50.817 14:20:46 -- spdk/autotest.sh@319 -- # '[' 0 -eq 1 ']' 00:07:50.817 14:20:46 -- spdk/autotest.sh@324 -- # '[' 0 -eq 1 ']' 00:07:50.817 14:20:46 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:07:50.817 14:20:46 -- spdk/autotest.sh@338 -- # '[' 0 -eq 1 ']' 00:07:50.817 14:20:46 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:50.817 14:20:46 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:50.817 14:20:46 -- spdk/autotest.sh@350 -- # '[' 0 -eq 1 ']' 00:07:50.817 14:20:46 -- spdk/autotest.sh@355 -- # '[' 0 -eq 1 ']' 00:07:50.817 14:20:46 -- spdk/autotest.sh@359 -- # '[' 0 -eq 1 ']' 00:07:50.817 14:20:46 -- spdk/autotest.sh@366 -- # [[ 0 -eq 1 ]] 00:07:50.817 14:20:46 -- spdk/autotest.sh@370 -- # [[ 0 -eq 1 ]] 00:07:50.817 14:20:46 -- spdk/autotest.sh@374 -- # [[ 1 -eq 1 ]] 00:07:50.817 14:20:46 -- spdk/autotest.sh@375 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:50.817 14:20:46 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:50.817 14:20:46 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:50.817 14:20:46 -- common/autotest_common.sh@10 -- # set +x 00:07:50.817 ************************************ 00:07:50.817 START TEST llvm_fuzz 00:07:50.817 ************************************ 00:07:50.817 14:20:46 llvm_fuzz -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:50.817 * Looking for test storage... 
00:07:50.817 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:50.817 14:20:46 llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:50.817 14:20:46 llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:07:50.817 14:20:46 llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:51.077 14:20:46 llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:51.077 14:20:46 llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:51.077 14:20:46 llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:51.077 14:20:46 llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:51.077 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.077 --rc genhtml_branch_coverage=1 00:07:51.077 --rc genhtml_function_coverage=1 00:07:51.077 --rc genhtml_legend=1 00:07:51.077 --rc geninfo_all_blocks=1 00:07:51.077 --rc geninfo_unexecuted_blocks=1 00:07:51.077 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.077 ' 00:07:51.077 14:20:46 llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:51.077 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.077 --rc genhtml_branch_coverage=1 00:07:51.077 --rc genhtml_function_coverage=1 00:07:51.077 --rc genhtml_legend=1 00:07:51.077 --rc geninfo_all_blocks=1 00:07:51.077 --rc 
geninfo_unexecuted_blocks=1 00:07:51.077 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.077 ' 00:07:51.077 14:20:46 llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:51.077 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.077 --rc genhtml_branch_coverage=1 00:07:51.077 --rc genhtml_function_coverage=1 00:07:51.077 --rc genhtml_legend=1 00:07:51.077 --rc geninfo_all_blocks=1 00:07:51.077 --rc geninfo_unexecuted_blocks=1 00:07:51.077 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.077 ' 00:07:51.077 14:20:46 llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:51.077 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.077 --rc genhtml_branch_coverage=1 00:07:51.077 --rc genhtml_function_coverage=1 00:07:51.077 --rc genhtml_legend=1 00:07:51.077 --rc geninfo_all_blocks=1 00:07:51.077 --rc geninfo_unexecuted_blocks=1 00:07:51.077 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.077 ' 00:07:51.077 14:20:46 llvm_fuzz -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:51.077 14:20:46 llvm_fuzz -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:51.077 14:20:47 llvm_fuzz -- common/autotest_common.sh@550 -- # fuzzers=() 00:07:51.077 14:20:47 llvm_fuzz -- common/autotest_common.sh@550 -- # local fuzzers 00:07:51.077 14:20:47 llvm_fuzz -- common/autotest_common.sh@552 -- # [[ -n '' ]] 00:07:51.077 14:20:47 llvm_fuzz -- common/autotest_common.sh@555 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:51.077 14:20:47 llvm_fuzz -- common/autotest_common.sh@556 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:51.077 14:20:47 llvm_fuzz -- common/autotest_common.sh@559 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:51.077 14:20:47 llvm_fuzz -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:51.077 14:20:47 llvm_fuzz -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:51.077 14:20:47 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:51.077 14:20:47 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:51.077 14:20:47 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:51.077 14:20:47 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:51.077 14:20:47 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:51.077 14:20:47 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:51.077 14:20:47 llvm_fuzz -- fuzz/llvm.sh@19 -- # run_test nvmf_llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:51.077 14:20:47 llvm_fuzz -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:51.077 14:20:47 llvm_fuzz -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:51.077 14:20:47 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:07:51.077 ************************************ 00:07:51.077 START TEST nvmf_llvm_fuzz 00:07:51.077 ************************************ 00:07:51.077 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:51.077 * Looking for test storage... 
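The lt 1.15 2 trace above compares versions by splitting both strings on '.', '-', and ':' and walking the fields left to right. A condensed sketch of the same idea, assuming purely numeric fields (the traced decimal helper additionally validates each field against ^[0-9]+$ before comparing):

    # Return success when $1 sorts strictly before $2, field by field.
    version_lt() {
        local -a a b
        local i x y IFS='.-:'
        read -ra a <<< "$1"
        read -ra b <<< "$2"
        for (( i = 0; i < ${#a[@]} || i < ${#b[@]}; i++ )); do
            x=${a[i]:-0} y=${b[i]:-0}         # missing fields count as 0
            (( 10#$x < 10#$y )) && return 0   # force base 10, e.g. "015"
            (( 10#$x > 10#$y )) && return 1
        done
        return 1   # equal is not less-than
    }

    version_lt 1.15 2 && echo "lcov is older than 2"   # true, as in the trace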
00:07:51.077 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:51.077 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:51.077 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:07:51.077 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:51.341 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.341 --rc genhtml_branch_coverage=1 00:07:51.341 --rc genhtml_function_coverage=1 00:07:51.341 --rc genhtml_legend=1 00:07:51.341 --rc geninfo_all_blocks=1 00:07:51.341 --rc geninfo_unexecuted_blocks=1 00:07:51.341 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.341 ' 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:51.341 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.341 --rc genhtml_branch_coverage=1 00:07:51.341 --rc genhtml_function_coverage=1 00:07:51.341 --rc genhtml_legend=1 00:07:51.341 --rc geninfo_all_blocks=1 00:07:51.341 --rc geninfo_unexecuted_blocks=1 00:07:51.341 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.341 ' 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:51.341 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.341 --rc genhtml_branch_coverage=1 00:07:51.341 --rc genhtml_function_coverage=1 00:07:51.341 --rc genhtml_legend=1 00:07:51.341 --rc geninfo_all_blocks=1 00:07:51.341 --rc geninfo_unexecuted_blocks=1 00:07:51.341 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.341 ' 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:51.341 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.341 --rc genhtml_branch_coverage=1 00:07:51.341 --rc genhtml_function_coverage=1 00:07:51.341 --rc genhtml_legend=1 00:07:51.341 --rc geninfo_all_blocks=1 00:07:51.341 --rc geninfo_unexecuted_blocks=1 00:07:51.341 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.341 ' 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@60 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz 
-- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@34 -- # set -e 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:51.341 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@17 -- # CONFIG_MAX_NUMA_NODES=1 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@18 -- # CONFIG_PGO_CAPTURE=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@19 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@20 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@21 -- # CONFIG_LTO=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@22 -- # CONFIG_ISCSI_INITIATOR=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@23 -- # CONFIG_CET=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@24 -- # 
CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@25 -- # CONFIG_OCF_PATH= 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@26 -- # CONFIG_RDMA_SET_TOS=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@27 -- # CONFIG_AIO_FSDEV=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@28 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@29 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@30 -- # CONFIG_UBLK=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@31 -- # CONFIG_ISAL_CRYPTO=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@32 -- # CONFIG_OPENSSL_PATH= 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@33 -- # CONFIG_OCF=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@34 -- # CONFIG_FUSE=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@35 -- # CONFIG_VTUNE_DIR= 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@36 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@37 -- # CONFIG_FUZZER=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@38 -- # CONFIG_FSDEV=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@39 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@40 -- # CONFIG_CRYPTO=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@41 -- # CONFIG_PGO_USE=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@42 -- # CONFIG_VHOST=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@43 -- # CONFIG_DAOS=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@44 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@45 -- # CONFIG_DAOS_DIR= 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@46 -- # CONFIG_UNIT_TESTS=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@47 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@48 -- # CONFIG_VIRTIO=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@49 -- # CONFIG_DPDK_UADK=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@50 -- # CONFIG_COVERAGE=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@51 -- # CONFIG_RDMA=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@52 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@53 -- # CONFIG_HAVE_LZ4=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@54 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@55 -- # CONFIG_URING_PATH= 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@56 -- # CONFIG_XNVME=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@57 -- # CONFIG_VFIO_USER=y 00:07:51.342 14:20:47 
llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@58 -- # CONFIG_ARCH=native 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@59 -- # CONFIG_HAVE_EVP_MAC=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@60 -- # CONFIG_URING_ZNS=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@61 -- # CONFIG_WERROR=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@62 -- # CONFIG_HAVE_LIBBSD=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@63 -- # CONFIG_UBSAN=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@64 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@65 -- # CONFIG_IPSEC_MB_DIR= 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@66 -- # CONFIG_GOLANG=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@67 -- # CONFIG_ISAL=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@68 -- # CONFIG_IDXD_KERNEL=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@69 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@70 -- # CONFIG_RDMA_PROV=verbs 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@71 -- # CONFIG_APPS=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@72 -- # CONFIG_SHARED=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@73 -- # CONFIG_HAVE_KEYUTILS=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@74 -- # CONFIG_FC_PATH= 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@75 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@76 -- # CONFIG_FC=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@77 -- # CONFIG_AVAHI=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@78 -- # CONFIG_FIO_PLUGIN=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@79 -- # CONFIG_RAID5F=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@80 -- # CONFIG_EXAMPLES=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@81 -- # CONFIG_TESTS=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@82 -- # CONFIG_CRYPTO_MLX5=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@83 -- # CONFIG_MAX_LCORES=128 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@84 -- # CONFIG_IPSEC_MB=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@85 -- # CONFIG_PGO_DIR= 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@86 -- # CONFIG_DEBUG=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@87 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@88 -- # CONFIG_CROSS_PREFIX= 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@89 -- # CONFIG_COPY_FILE_RANGE=y 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@90 -- # CONFIG_URING=n 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:51.342 14:20:47 
llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:51.342 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:51.342 #define SPDK_CONFIG_H 00:07:51.342 #define SPDK_CONFIG_AIO_FSDEV 1 00:07:51.342 #define SPDK_CONFIG_APPS 1 00:07:51.342 #define SPDK_CONFIG_ARCH native 00:07:51.342 #undef SPDK_CONFIG_ASAN 00:07:51.342 #undef SPDK_CONFIG_AVAHI 00:07:51.342 #undef SPDK_CONFIG_CET 00:07:51.342 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:07:51.342 #define SPDK_CONFIG_COVERAGE 1 00:07:51.342 #define SPDK_CONFIG_CROSS_PREFIX 00:07:51.342 #undef SPDK_CONFIG_CRYPTO 00:07:51.342 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:51.342 #undef SPDK_CONFIG_CUSTOMOCF 00:07:51.343 #undef SPDK_CONFIG_DAOS 00:07:51.343 #define SPDK_CONFIG_DAOS_DIR 00:07:51.343 #define SPDK_CONFIG_DEBUG 1 00:07:51.343 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:51.343 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:51.343 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:51.343 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:51.343 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:51.343 #undef SPDK_CONFIG_DPDK_UADK 00:07:51.343 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:51.343 #define SPDK_CONFIG_EXAMPLES 1 00:07:51.343 #undef SPDK_CONFIG_FC 00:07:51.343 #define SPDK_CONFIG_FC_PATH 00:07:51.343 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:51.343 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:51.343 #define SPDK_CONFIG_FSDEV 1 00:07:51.343 #undef 
SPDK_CONFIG_FUSE 00:07:51.343 #define SPDK_CONFIG_FUZZER 1 00:07:51.343 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:51.343 #undef SPDK_CONFIG_GOLANG 00:07:51.343 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:51.343 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:07:51.343 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:51.343 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:07:51.343 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:51.343 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:51.343 #undef SPDK_CONFIG_HAVE_LZ4 00:07:51.343 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:07:51.343 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:07:51.343 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:51.343 #define SPDK_CONFIG_IDXD 1 00:07:51.343 #define SPDK_CONFIG_IDXD_KERNEL 1 00:07:51.343 #undef SPDK_CONFIG_IPSEC_MB 00:07:51.343 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:51.343 #define SPDK_CONFIG_ISAL 1 00:07:51.343 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:51.343 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:51.343 #define SPDK_CONFIG_LIBDIR 00:07:51.343 #undef SPDK_CONFIG_LTO 00:07:51.343 #define SPDK_CONFIG_MAX_LCORES 128 00:07:51.343 #define SPDK_CONFIG_MAX_NUMA_NODES 1 00:07:51.343 #define SPDK_CONFIG_NVME_CUSE 1 00:07:51.343 #undef SPDK_CONFIG_OCF 00:07:51.343 #define SPDK_CONFIG_OCF_PATH 00:07:51.343 #define SPDK_CONFIG_OPENSSL_PATH 00:07:51.343 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:51.343 #define SPDK_CONFIG_PGO_DIR 00:07:51.343 #undef SPDK_CONFIG_PGO_USE 00:07:51.343 #define SPDK_CONFIG_PREFIX /usr/local 00:07:51.343 #undef SPDK_CONFIG_RAID5F 00:07:51.343 #undef SPDK_CONFIG_RBD 00:07:51.343 #define SPDK_CONFIG_RDMA 1 00:07:51.343 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:51.343 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:51.343 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:51.343 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:51.343 #undef SPDK_CONFIG_SHARED 00:07:51.343 #undef SPDK_CONFIG_SMA 00:07:51.343 #define SPDK_CONFIG_TESTS 1 00:07:51.343 #undef SPDK_CONFIG_TSAN 00:07:51.343 #define SPDK_CONFIG_UBLK 1 00:07:51.343 #define SPDK_CONFIG_UBSAN 1 00:07:51.343 #undef SPDK_CONFIG_UNIT_TESTS 00:07:51.343 #undef SPDK_CONFIG_URING 00:07:51.343 #define SPDK_CONFIG_URING_PATH 00:07:51.343 #undef SPDK_CONFIG_URING_ZNS 00:07:51.343 #undef SPDK_CONFIG_USDT 00:07:51.343 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:51.343 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:51.343 #define SPDK_CONFIG_VFIO_USER 1 00:07:51.343 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:51.343 #define SPDK_CONFIG_VHOST 1 00:07:51.343 #define SPDK_CONFIG_VIRTIO 1 00:07:51.343 #undef SPDK_CONFIG_VTUNE 00:07:51.343 #define SPDK_CONFIG_VTUNE_DIR 00:07:51.343 #define SPDK_CONFIG_WERROR 1 00:07:51.343 #define SPDK_CONFIG_WPDK_DIR 00:07:51.343 #undef SPDK_CONFIG_XNVME 00:07:51.343 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@15 -- # shopt -s extglob 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- 
scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@5 -- # export PATH 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@64 -- # TEST_TAG=N/A 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@67 -- # 
PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@68 -- # uname -s 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@68 -- # PM_OS=Linux 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@76 -- # SUDO[0]= 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@76 -- # SUDO[1]='sudo -E' 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ Linux == Linux ]] 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power ]] 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@58 -- # : 1 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@62 -- # : 0 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@64 -- # : 0 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@66 -- # : 1 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@68 -- # : 0 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@70 -- # : 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@72 -- # : 0 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@74 -- # : 0 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@76 -- # : 0 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@78 -- # : 0 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@80 -- # : 0 00:07:51.343 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@82 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@84 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@86 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@88 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@90 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@92 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@94 -- # : 0 00:07:51.344 14:20:47 
llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@96 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@98 -- # : 1 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@100 -- # : 1 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@102 -- # : rdma 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@104 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@106 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@108 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@110 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@112 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@114 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@116 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@118 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@120 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@122 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@124 -- # : 1 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@126 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@128 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:07:51.344 
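Each ": <value>" / "export SPDK_TEST_..." pair traced through autotest_common.sh is consistent with the shell default-assignment idiom: the parameter keeps any value inherited from the CI environment and only falls back to the listed default. A minimal sketch with one representative flag from this run:

    # ':' is the shell no-op; the expansion assigns 1 only if the variable
    # is unset or null, then the export makes the decision visible to
    # child test scripts.
    : "${SPDK_TEST_FUZZER_SHORT:=1}"
    export SPDK_TEST_FUZZER_SHORT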
14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@130 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@132 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@134 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@136 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@137 -- # export SPDK_TEST_VMD 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@138 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@140 -- # : v23.11 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@142 -- # : true 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@144 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@146 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@148 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@150 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@152 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@154 -- # : 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@156 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@158 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@160 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@162 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@164 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@165 -- # export 
SPDK_TEST_ACCEL_DSA 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@166 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@169 -- # : 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@171 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@173 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@175 -- # : 1 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@177 -- # : 0 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@178 -- # export SPDK_TEST_NVME_INTERRUPT 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@181 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@181 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@182 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@182 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@183 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@183 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:51.344 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@184 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@184 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@187 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@187 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@191 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@191 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@195 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@195 -- # PYTHONDONTWRITEBYTECODE=1 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@199 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@199 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@200 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@200 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@204 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@205 -- # rm -rf /var/tmp/asan_suppression_file 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@206 -- # cat 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@242 -- # echo leak:libfuse3.so 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@244 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@244 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@246 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@246 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@248 -- # '[' -z /var/spdk/dependencies ']' 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@251 -- # export DEPENDENCY_DIR 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@255 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@255 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@256 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@256 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@259 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@259 -- 
# QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@260 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@260 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@262 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@262 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@265 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@265 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@267 -- # _LCOV_MAIN=0 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@268 -- # _LCOV_LLVM=1 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@269 -- # _LCOV= 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ '' == *clang* ]] 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ 1 -eq 1 ]] 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # _LCOV=1 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@272 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@273 -- # _lcov_opt[_LCOV_MAIN]= 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@275 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@278 -- # '[' 0 -eq 0 ']' 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@279 -- # export valgrind= 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@279 -- # valgrind= 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@285 -- # uname -s 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@285 -- # '[' Linux = Linux ']' 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@286 -- # HUGEMEM=4096 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@287 -- # export CLEAR_HUGE=yes 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@287 -- # CLEAR_HUGE=yes 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@289 -- # MAKE=make 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@290 -- # MAKEFLAGS=-j112 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@306 -- # export HUGEMEM=4096 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@306 -- # HUGEMEM=4096 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@308 -- # NO_HUGE=() 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@309 -- # TEST_MODE= 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@331 -- # [[ -z 321974 ]] 00:07:51.345 
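The kill -0 probe traced above (common/autotest_common.sh@331) is the standard liveness check: signal 0 is never delivered, so the call only reports whether the PID exists and is signalable by the caller. A short sketch, with the PID variable taken from this trace for illustration:

    pid=321974                        # the autotest PID checked in the trace
    if kill -0 "$pid" 2>/dev/null; then
        echo "test run $pid is still alive"
    fi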
14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@331 -- # kill -0 321974 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1678 -- # set_test_storage 2147483648 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@341 -- # [[ -v testdir ]] 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@343 -- # local requested_size=2147483648 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@344 -- # local mount target_dir 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@346 -- # local -A mounts fss sizes avails uses 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@347 -- # local source fs size avail mount use 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@349 -- # local storage_fallback storage_candidates 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@351 -- # mktemp -udt spdk.XXXXXX 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@351 -- # storage_fallback=/tmp/spdk.Ylr72c 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@356 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@358 -- # [[ -n '' ]] 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@363 -- # [[ -n '' ]] 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@368 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.Ylr72c/tests/nvmf /tmp/spdk.Ylr72c 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # requested_size=2214592512 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@340 -- # df -T 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@340 -- # grep -v Filesystem 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_devtmpfs 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=devtmpfs 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=67108864 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=67108864 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=0 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=/dev/pmem0 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=ext2 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=4096 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=5284429824 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5284425728 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- 
common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_root 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=overlay 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=52377444352 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=61730582528 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=9353138176 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:07:51.345 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=30861860864 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=30865289216 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=3428352 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=12340117504 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=12346118144 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=6000640 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=30865113088 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=30865293312 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=180224 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=6173044736 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=6173057024 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=12288 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@379 -- # printf '* Looking for test storage...\n' 00:07:51.346 * Looking for test storage... 
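The set_test_storage pass traced above boils down to a small df(1) parse-and-pick loop: index every mount from df -T, then take the first candidate directory whose filesystem can hold the requested test storage. The following is a condensed bash sketch of that logic, not a verbatim copy of autotest_common.sh; variable names mirror the trace, while the 1K-block-to-bytes conversion and the tmpfs/overlay special-casing are assumptions inferred from the values in the log.

    # Condensed sketch of set_test_storage (assumptions noted above).
    set_test_storage_sketch() {
        local requested_size=$1              # e.g. 2147483648 plus 64 MiB slack
        local -A mounts fss sizes avails uses
        local source fs size use avail _ mount target_dir target_space new_size

        # Index every mount point by its df -T row. df reports 1K blocks;
        # converting to bytes matches the magnitudes seen in the trace
        # (assumption, not confirmed by the log itself).
        while read -r source fs size use avail _ mount; do
            mounts["$mount"]=$source
            fss["$mount"]=$fs
            sizes["$mount"]=$((size * 1024))
            uses["$mount"]=$((use * 1024))
            avails["$mount"]=$((avail * 1024))
        done < <(df -T | grep -v Filesystem)

        for target_dir in "${storage_candidates[@]}"; do
            mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/{print $6}')
            target_space=${avails["$mount"]}
            ((target_space == 0 || target_space < requested_size)) && continue
            # On overlay / (this run) there is nothing to grow; just warn
            # when projected usage would top 95% of the filesystem.
            if [[ ${fss["$mount"]} != tmpfs && ${fss["$mount"]} != ramfs ]]; then
                new_size=$((uses["$mount"] + requested_size))
                ((new_size * 100 / sizes["$mount"] > 95)) \
                    && printf 'WARN: %s nearly full\n' "$mount" >&2
            fi
            export SPDK_TEST_STORAGE=$target_dir
            printf '* Found test storage at %s\n' "$target_dir"
            return 0
        done
        return 1
    }

In this run the first candidate (the spdk_root overlay on /) already has ~52 GB available against a ~2.2 GB request, so the loop exits on its first iteration, as the "Found test storage" line below shows.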
00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@381 -- # local target_space new_size 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@382 -- # for target_dir in "${storage_candidates[@]}" 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # mount=/ 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@387 -- # target_space=52377444352 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@388 -- # (( target_space == 0 || target_space < requested_size )) 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@391 -- # (( target_space >= requested_size )) 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == tmpfs ]] 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == ramfs ]] 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ / == / ]] 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@394 -- # new_size=11567730688 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@395 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@400 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@400 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@401 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:51.346 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@402 -- # return 0 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1680 -- # set -o errtrace 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # shopt -s extdebug 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1682 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1684 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1685 -- # true 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1687 -- # xtrace_fd 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@27 -- # exec 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@29 -- # exec 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@31 -- # xtrace_restore 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@16 -- # unset -v 
'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@18 -- # set -x 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:07:51.346 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:51.607 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:51.608 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.608 --rc genhtml_branch_coverage=1 00:07:51.608 --rc genhtml_function_coverage=1 00:07:51.608 --rc genhtml_legend=1 00:07:51.608 --rc geninfo_all_blocks=1 00:07:51.608 --rc geninfo_unexecuted_blocks=1 00:07:51.608 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.608 ' 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:51.608 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.608 --rc genhtml_branch_coverage=1 00:07:51.608 --rc genhtml_function_coverage=1 00:07:51.608 --rc genhtml_legend=1 00:07:51.608 --rc geninfo_all_blocks=1 00:07:51.608 --rc geninfo_unexecuted_blocks=1 00:07:51.608 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.608 ' 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:51.608 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.608 --rc genhtml_branch_coverage=1 00:07:51.608 --rc genhtml_function_coverage=1 00:07:51.608 --rc genhtml_legend=1 00:07:51.608 --rc geninfo_all_blocks=1 00:07:51.608 --rc geninfo_unexecuted_blocks=1 00:07:51.608 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.608 ' 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:51.608 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:51.608 --rc genhtml_branch_coverage=1 00:07:51.608 --rc genhtml_function_coverage=1 00:07:51.608 --rc genhtml_legend=1 00:07:51.608 --rc geninfo_all_blocks=1 00:07:51.608 --rc geninfo_unexecuted_blocks=1 00:07:51.608 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:51.608 ' 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@61 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:51.608 14:20:47 
llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@8 -- # pids=() 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@63 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@64 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@64 -- # fuzz_num=25 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@65 -- # (( fuzz_num != 0 )) 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@67 -- # trap 'cleanup /tmp/llvm_fuzz* /var/tmp/suppress_nvmf_fuzz; exit 1' SIGINT SIGTERM EXIT 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@69 -- # mem_size=512 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@70 -- # [[ 1 -eq 1 ]] 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@71 -- # start_llvm_fuzz_short 25 1 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@69 -- # local fuzz_num=25 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@70 -- # local time=1 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i = 0 )) 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 0 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4400 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:51.608 14:20:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 00:07:51.608 [2024-11-18 14:20:47.566653] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:07:51.608 [2024-11-18 14:20:47.566728] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid322036 ] 00:07:51.868 [2024-11-18 14:20:47.781900] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:51.868 [2024-11-18 14:20:47.794700] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:51.868 [2024-11-18 14:20:47.847136] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:51.868 [2024-11-18 14:20:47.863458] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:51.868 INFO: Running with entropic power schedule (0xFF, 100). 00:07:51.868 INFO: Seed: 3412646481 00:07:51.868 INFO: Loaded 1 modules (387495 inline 8-bit counters): 387495 [0x2acbe4c, 0x2b2a7f3), 00:07:51.868 INFO: Loaded 1 PC tables (387495 PCs): 387495 [0x2b2a7f8,0x3114268), 00:07:51.868 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:51.868 INFO: A corpus is not provided, starting from an empty corpus 00:07:51.868 #2 INITED exec/s: 0 rss: 64Mb 00:07:51.868 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:51.868 This may also happen if the target rejected all inputs we tried so far 00:07:51.868 [2024-11-18 14:20:47.922105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.868 [2024-11-18 14:20:47.922135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.128 NEW_FUNC[1/713]: 0x459648 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:52.128 NEW_FUNC[2/713]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:52.128 #5 NEW cov: 12153 ft: 12167 corp: 2/113b lim: 320 exec/s: 0 rss: 72Mb L: 112/112 MS: 3 ShuffleBytes-ChangeByte-InsertRepeatedBytes- 00:07:52.128 [2024-11-18 14:20:48.253056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (af) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.128 [2024-11-18 14:20:48.253109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.388 NEW_FUNC[1/3]: 0x154e0f8 in nvmf_tcp_req_set_cpl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:2213 00:07:52.388 NEW_FUNC[2/3]: 0x1983f08 in nvme_get_sgl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:159 00:07:52.388 #19 NEW cov: 12339 ft: 13013 corp: 3/200b lim: 320 exec/s: 0 rss: 72Mb L: 87/112 MS: 4 ChangeBit-ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:07:52.388 [2024-11-18 14:20:48.303014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.388 [2024-11-18 14:20:48.303041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.388 #20 NEW cov: 12345 ft: 13140 corp: 4/275b lim: 320 exec/s: 0 rss: 72Mb L: 75/112 MS: 1 EraseBytes- 00:07:52.388 [2024-11-18 14:20:48.363188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.388 [2024-11-18 14:20:48.363215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.388 #21 NEW cov: 12430 ft: 13415 corp: 5/350b lim: 320 exec/s: 0 rss: 72Mb L: 75/112 MS: 1 ChangeBinInt- 00:07:52.388 [2024-11-18 14:20:48.423326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.388 [2024-11-18 14:20:48.423353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.388 #22 NEW cov: 12430 ft: 13478 corp: 6/425b lim: 320 exec/s: 0 rss: 72Mb L: 75/112 MS: 1 ChangeByte- 00:07:52.388 [2024-11-18 14:20:48.483453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.388 [2024-11-18 14:20:48.483485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.388 #23 NEW cov: 12430 ft: 13649 corp: 7/500b lim: 320 exec/s: 0 rss: 72Mb L: 75/112 MS: 1 ChangeBit- 00:07:52.648 [2024-11-18 14:20:48.523565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (af) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.648 [2024-11-18 14:20:48.523592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.648 #24 NEW cov: 12430 ft: 13686 corp: 8/585b lim: 320 exec/s: 0 rss: 72Mb L: 85/112 MS: 1 EraseBytes- 00:07:52.648 [2024-11-18 14:20:48.583773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.648 [2024-11-18 14:20:48.583800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.649 #25 NEW cov: 12430 ft: 13729 corp: 9/661b lim: 320 exec/s: 0 rss: 72Mb L: 76/112 MS: 1 InsertByte- 00:07:52.649 [2024-11-18 14:20:48.643927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.649 [2024-11-18 14:20:48.643954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.649 #26 NEW cov: 12430 ft: 13762 corp: 10/736b lim: 320 exec/s: 0 rss: 72Mb L: 75/112 MS: 1 ChangeBinInt- 00:07:52.649 [2024-11-18 14:20:48.684050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.649 [2024-11-18 14:20:48.684076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.649 #27 NEW cov: 12430 ft: 13807 corp: 11/811b lim: 320 exec/s: 0 rss: 72Mb L: 75/112 MS: 1 ChangeBit- 00:07:52.649 [2024-11-18 14:20:48.724115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:52.649 
[2024-11-18 14:20:48.724141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.649 #30 NEW cov: 12430 ft: 13872 corp: 12/922b lim: 320 exec/s: 0 rss: 72Mb L: 111/112 MS: 3 CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:07:52.649 [2024-11-18 14:20:48.764261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.649 [2024-11-18 14:20:48.764287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.909 NEW_FUNC[1/1]: 0x1c52b68 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:52.909 #31 NEW cov: 12453 ft: 13889 corp: 13/997b lim: 320 exec/s: 0 rss: 73Mb L: 75/112 MS: 1 ShuffleBytes- 00:07:52.909 [2024-11-18 14:20:48.824429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.909 [2024-11-18 14:20:48.824455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.909 #32 NEW cov: 12453 ft: 13910 corp: 14/1072b lim: 320 exec/s: 0 rss: 73Mb L: 75/112 MS: 1 ChangeByte- 00:07:52.909 [2024-11-18 14:20:48.864547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.909 [2024-11-18 14:20:48.864578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.909 #33 NEW cov: 12453 ft: 13964 corp: 15/1148b lim: 320 exec/s: 0 rss: 73Mb L: 76/112 MS: 1 ShuffleBytes- 00:07:52.909 [2024-11-18 14:20:48.924716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.909 [2024-11-18 14:20:48.924742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.909 #34 NEW cov: 12453 ft: 13978 corp: 16/1223b lim: 320 exec/s: 34 rss: 73Mb L: 75/112 MS: 1 ShuffleBytes- 00:07:52.909 [2024-11-18 14:20:48.964812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.909 [2024-11-18 14:20:48.964837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.909 #35 NEW cov: 12453 ft: 14047 corp: 17/1315b lim: 320 exec/s: 35 rss: 73Mb L: 92/112 MS: 1 InsertRepeatedBytes- 00:07:52.909 [2024-11-18 14:20:49.004911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.909 [2024-11-18 14:20:49.004936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.909 #36 NEW cov: 12453 ft: 14056 corp: 18/1427b lim: 320 exec/s: 36 rss: 73Mb L: 112/112 MS: 1 ChangeBinInt- 00:07:53.169 [2024-11-18 14:20:49.045066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:53.169 [2024-11-18 14:20:49.045092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.169 #41 NEW cov: 12453 ft: 14072 corp: 19/1504b lim: 320 exec/s: 41 rss: 73Mb 
L: 77/112 MS: 5 EraseBytes-ShuffleBytes-CopyPart-ShuffleBytes-CopyPart- 00:07:53.169 [2024-11-18 14:20:49.105214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:53.169 [2024-11-18 14:20:49.105240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.169 #42 NEW cov: 12453 ft: 14087 corp: 20/1616b lim: 320 exec/s: 42 rss: 73Mb L: 112/112 MS: 1 CopyPart- 00:07:53.169 [2024-11-18 14:20:49.165372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:53.169 [2024-11-18 14:20:49.165398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.169 #43 NEW cov: 12453 ft: 14093 corp: 21/1728b lim: 320 exec/s: 43 rss: 73Mb L: 112/112 MS: 1 CrossOver- 00:07:53.169 [2024-11-18 14:20:49.205451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:53.169 [2024-11-18 14:20:49.205477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.169 #44 NEW cov: 12453 ft: 14117 corp: 22/1803b lim: 320 exec/s: 44 rss: 73Mb L: 75/112 MS: 1 ChangeByte- 00:07:53.169 [2024-11-18 14:20:49.265663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:53.169 [2024-11-18 14:20:49.265689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.169 #45 NEW cov: 12453 ft: 14183 corp: 23/1878b lim: 320 exec/s: 45 rss: 73Mb L: 75/112 MS: 1 ChangeByte- 00:07:53.429 [2024-11-18 14:20:49.305744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:53.429 [2024-11-18 14:20:49.305770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.429 #46 NEW cov: 12453 ft: 14185 corp: 24/1953b lim: 320 exec/s: 46 rss: 73Mb L: 75/112 MS: 1 ChangeBinInt- 00:07:53.429 [2024-11-18 14:20:49.345850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:0000004b cdw11:00000000 00:07:53.429 [2024-11-18 14:20:49.345876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.429 #47 NEW cov: 12453 ft: 14186 corp: 25/2070b lim: 320 exec/s: 47 rss: 73Mb L: 117/117 MS: 1 CrossOver- 00:07:53.429 [2024-11-18 14:20:49.386138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:53.429 [2024-11-18 14:20:49.386167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.429 [2024-11-18 14:20:49.386243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:53.429 [2024-11-18 14:20:49.386263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.430 [2024-11-18 14:20:49.386353] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:53.430 [2024-11-18 14:20:49.386370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.430 #48 NEW cov: 12453 ft: 14437 corp: 26/2273b lim: 320 exec/s: 48 rss: 73Mb L: 203/203 MS: 1 InsertRepeatedBytes- 00:07:53.430 [2024-11-18 14:20:49.426096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:6 cdw10:00000000 cdw11:00000000 00:07:53.430 [2024-11-18 14:20:49.426121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.430 #49 NEW cov: 12453 ft: 14439 corp: 27/2348b lim: 320 exec/s: 49 rss: 73Mb L: 75/203 MS: 1 ChangeBinInt- 00:07:53.430 [2024-11-18 14:20:49.466225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000004b 00:07:53.430 [2024-11-18 14:20:49.466252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.430 #50 NEW cov: 12453 ft: 14463 corp: 28/2469b lim: 320 exec/s: 50 rss: 73Mb L: 121/203 MS: 1 CMP- DE: "\021\000\000\000"- 00:07:53.430 [2024-11-18 14:20:49.526356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:0000004b cdw11:00002000 00:07:53.430 [2024-11-18 14:20:49.526381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.430 #51 NEW cov: 12453 ft: 14502 corp: 29/2586b lim: 320 exec/s: 51 rss: 73Mb L: 117/203 MS: 1 ChangeByte- 00:07:53.690 [2024-11-18 14:20:49.566494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:33333333 cdw11:33333333 00:07:53.690 [2024-11-18 14:20:49.566520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.690 #53 NEW cov: 12454 ft: 14515 corp: 30/2711b lim: 320 exec/s: 53 rss: 73Mb L: 125/203 MS: 2 InsertRepeatedBytes-InsertRepeatedBytes- 00:07:53.690 [2024-11-18 14:20:49.606759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:53.690 [2024-11-18 14:20:49.606785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.690 [2024-11-18 14:20:49.606859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:53.690 [2024-11-18 14:20:49.606879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.690 [2024-11-18 14:20:49.606955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:53.690 [2024-11-18 14:20:49.606973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.690 #54 NEW cov: 12454 ft: 14552 corp: 31/2930b lim: 320 exec/s: 54 rss: 73Mb L: 219/219 MS: 1 
InsertRepeatedBytes- 00:07:53.690 [2024-11-18 14:20:49.666786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:53.690 [2024-11-18 14:20:49.666812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.690 #55 NEW cov: 12454 ft: 14638 corp: 32/3023b lim: 320 exec/s: 55 rss: 73Mb L: 93/219 MS: 1 InsertByte- 00:07:53.690 [2024-11-18 14:20:49.726952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:53.690 [2024-11-18 14:20:49.726979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.690 #56 NEW cov: 12454 ft: 14660 corp: 33/3140b lim: 320 exec/s: 56 rss: 74Mb L: 117/219 MS: 1 CopyPart- 00:07:53.690 [2024-11-18 14:20:49.787096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:80000 cdw10:00000000 cdw11:00000000 00:07:53.690 [2024-11-18 14:20:49.787123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.690 #57 NEW cov: 12454 ft: 14687 corp: 34/3215b lim: 320 exec/s: 57 rss: 74Mb L: 75/219 MS: 1 ChangeBit- 00:07:53.950 [2024-11-18 14:20:49.827221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:0000004b cdw11:00002000 00:07:53.950 [2024-11-18 14:20:49.827248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.950 #58 NEW cov: 12454 ft: 14708 corp: 35/3332b lim: 320 exec/s: 58 rss: 74Mb L: 117/219 MS: 1 ChangeByte- 00:07:53.950 [2024-11-18 14:20:49.867467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (44) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:53.950 [2024-11-18 14:20:49.867493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.950 [2024-11-18 14:20:49.867562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:53.950 [2024-11-18 14:20:49.867581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.950 #59 NEW cov: 12454 ft: 14839 corp: 36/3506b lim: 320 exec/s: 29 rss: 74Mb L: 174/219 MS: 1 CrossOver- 00:07:53.950 #59 DONE cov: 12454 ft: 14839 corp: 36/3506b lim: 320 exec/s: 29 rss: 74Mb 00:07:53.950 ###### Recommended dictionary. ###### 00:07:53.950 "\021\000\000\000" # Uses: 0 00:07:53.950 ###### End of recommended dictionary. 
###### 00:07:53.950 Done 59 runs in 2 second(s) 00:07:53.950 14:20:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_0.conf /var/tmp/suppress_nvmf_fuzz 00:07:53.950 14:20:50 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:53.950 14:20:50 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:53.950 14:20:50 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:53.950 14:20:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:07:53.950 14:20:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:53.950 14:20:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:53.950 14:20:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:53.950 14:20:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:07:53.950 14:20:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:53.950 14:20:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:53.950 14:20:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 1 00:07:53.950 14:20:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4401 00:07:53.950 14:20:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:53.950 14:20:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:07:53.950 14:20:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:53.950 14:20:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:53.950 14:20:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:53.950 14:20:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 00:07:53.950 [2024-11-18 14:20:50.053026] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:07:53.950 [2024-11-18 14:20:50.053092] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid322486 ] 00:07:54.210 [2024-11-18 14:20:50.259667] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:54.210 [2024-11-18 14:20:50.272756] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:54.210 [2024-11-18 14:20:50.325507] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:54.470 [2024-11-18 14:20:50.341858] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:07:54.470 INFO: Running with entropic power schedule (0xFF, 100). 
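Between runs, nvmf/run.sh re-derives everything from the fuzzer index: port 44NN, a private JSON config with trsvcid rewritten via sed, the LSan leak suppressions, and the llvm_nvme_fuzz invocation itself. The sketch below reconstructs that per-target setup from the flags visible in the trace; the function body, and the $rootdir/$testdir helper variables, are assumptions rather than the verbatim script.

    # Sketch of start_llvm_fuzz as reconstructed from the trace above;
    # $rootdir and $testdir are assumed to point at the spdk checkout and
    # the nvmf fuzz test directory respectively.
    start_llvm_fuzz() {
        local fuzzer_type=$1 timen=$2 core=$3
        local port corpus_dir nvmf_cfg suppress_file trid

        port=44$(printf %02d "$fuzzer_type")          # 4400, 4401, ...
        corpus_dir=$rootdir/../corpus/llvm_nvmf_$fuzzer_type
        nvmf_cfg=/tmp/fuzz_json_$fuzzer_type.conf
        suppress_file=/var/tmp/suppress_nvmf_fuzz

        mkdir -p "$corpus_dir"
        trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

        # Give each fuzzer its own TCP port in a private copy of the config.
        sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
            "$testdir/fuzz_json.conf" > "$nvmf_cfg"

        # Known-intentional leaks on the shutdown path, suppressed for LSan.
        {
            echo leak:spdk_nvmf_qpair_disconnect
            echo leak:nvmf_ctrlr_create
        } > "$suppress_file"

        LSAN_OPTIONS=report_objects=1:suppressions=$suppress_file:print_suppressions=0 \
            "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
            -m "$core" -s 512 -P "$rootdir/../output/llvm/" \
            -F "$trid" -c "$nvmf_cfg" -t "$timen" -D "$corpus_dir" -Z "$fuzzer_type"
    }

    # The short-fuzz driver loops this over all 25 targets, 1 second each:
    # for ((i = 0; i < fuzz_num; i++)); do start_llvm_fuzz "$i" 1 0x1; done

The rm -rf of the per-run config and suppression file at run.sh@54 above is the matching cleanup; the loop then launches target 1 (fuzz_admin_get_log_page_command) on port 4401, as the entries that follow show.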
00:07:54.470 INFO: Seed: 1597665628 00:07:54.470 INFO: Loaded 1 modules (387495 inline 8-bit counters): 387495 [0x2acbe4c, 0x2b2a7f3), 00:07:54.470 INFO: Loaded 1 PC tables (387495 PCs): 387495 [0x2b2a7f8,0x3114268), 00:07:54.470 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:54.470 INFO: A corpus is not provided, starting from an empty corpus 00:07:54.470 #2 INITED exec/s: 0 rss: 65Mb 00:07:54.470 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:54.470 This may also happen if the target rejected all inputs we tried so far 00:07:54.470 [2024-11-18 14:20:50.397083] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:54.470 [2024-11-18 14:20:50.397203] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:54.470 [2024-11-18 14:20:50.397311] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:54.470 [2024-11-18 14:20:50.397520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aa383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.470 [2024-11-18 14:20:50.397554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.470 [2024-11-18 14:20:50.397612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a3a383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.470 [2024-11-18 14:20:50.397627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.470 [2024-11-18 14:20:50.397680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:a3a383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.470 [2024-11-18 14:20:50.397693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.731 NEW_FUNC[1/716]: 0x459f48 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:07:54.731 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:54.731 #3 NEW cov: 12268 ft: 12271 corp: 2/23b lim: 30 exec/s: 0 rss: 72Mb L: 22/22 MS: 1 InsertRepeatedBytes- 00:07:54.731 [2024-11-18 14:20:50.738466] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:54.731 [2024-11-18 14:20:50.738594] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:54.731 [2024-11-18 14:20:50.738708] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:54.731 [2024-11-18 14:20:50.738962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aa383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.731 [2024-11-18 14:20:50.739013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.731 [2024-11-18 14:20:50.739091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a3a383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.731 [2024-11-18 
14:20:50.739117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.731 [2024-11-18 14:20:50.739193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:a3a383aa cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.731 [2024-11-18 14:20:50.739217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.731 #4 NEW cov: 12385 ft: 13101 corp: 3/46b lim: 30 exec/s: 0 rss: 72Mb L: 23/23 MS: 1 InsertByte- 00:07:54.731 [2024-11-18 14:20:50.808055] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:54.731 [2024-11-18 14:20:50.808174] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000e0a3 00:07:54.731 [2024-11-18 14:20:50.808279] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:54.731 [2024-11-18 14:20:50.808490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aa383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.731 [2024-11-18 14:20:50.808516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.731 [2024-11-18 14:20:50.808571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a3a383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.731 [2024-11-18 14:20:50.808585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.731 [2024-11-18 14:20:50.808640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:a3a383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.731 [2024-11-18 14:20:50.808653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.731 #5 NEW cov: 12391 ft: 13267 corp: 4/68b lim: 30 exec/s: 0 rss: 72Mb L: 22/23 MS: 1 ChangeByte- 00:07:54.731 [2024-11-18 14:20:50.848101] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.731 [2024-11-18 14:20:50.848217] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.731 [2024-11-18 14:20:50.848427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.731 [2024-11-18 14:20:50.848453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.731 [2024-11-18 14:20:50.848508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.731 [2024-11-18 14:20:50.848524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.991 #9 NEW cov: 12476 ft: 13794 corp: 5/85b lim: 30 exec/s: 0 rss: 72Mb L: 17/23 MS: 4 CopyPart-InsertByte-CrossOver-InsertRepeatedBytes- 00:07:54.991 [2024-11-18 14:20:50.888184] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.991 [2024-11-18 14:20:50.888296] 
ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.991 [2024-11-18 14:20:50.888498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.991 [2024-11-18 14:20:50.888528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.991 [2024-11-18 14:20:50.888579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.991 [2024-11-18 14:20:50.888594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.991 #10 NEW cov: 12476 ft: 13939 corp: 6/99b lim: 30 exec/s: 0 rss: 72Mb L: 14/23 MS: 1 EraseBytes- 00:07:54.991 [2024-11-18 14:20:50.948395] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.991 [2024-11-18 14:20:50.948522] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.991 [2024-11-18 14:20:50.948631] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000a0a 00:07:54.991 [2024-11-18 14:20:50.948843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.991 [2024-11-18 14:20:50.948869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.991 [2024-11-18 14:20:50.948922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.991 [2024-11-18 14:20:50.948936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.991 [2024-11-18 14:20:50.948988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.992 [2024-11-18 14:20:50.949003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.992 #11 NEW cov: 12476 ft: 14028 corp: 7/118b lim: 30 exec/s: 0 rss: 72Mb L: 19/23 MS: 1 CopyPart- 00:07:54.992 [2024-11-18 14:20:50.988530] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:54.992 [2024-11-18 14:20:50.988648] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000e0a3 00:07:54.992 [2024-11-18 14:20:50.988756] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:54.992 [2024-11-18 14:20:50.988972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aa383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.992 [2024-11-18 14:20:50.988997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.992 [2024-11-18 14:20:50.989053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a3a383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.992 [2024-11-18 14:20:50.989067] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.992 [2024-11-18 14:20:50.989120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:a3a383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.992 [2024-11-18 14:20:50.989134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.992 #12 NEW cov: 12476 ft: 14114 corp: 8/140b lim: 30 exec/s: 0 rss: 73Mb L: 22/23 MS: 1 CopyPart- 00:07:54.992 [2024-11-18 14:20:51.048670] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b1b1 00:07:54.992 [2024-11-18 14:20:51.048787] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b1b1 00:07:54.992 [2024-11-18 14:20:51.048903] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000b1b1 00:07:54.992 [2024-11-18 14:20:51.049107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:dbb181b1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.992 [2024-11-18 14:20:51.049134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.992 [2024-11-18 14:20:51.049188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:b1b181b1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.992 [2024-11-18 14:20:51.049202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.992 [2024-11-18 14:20:51.049256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:b1b181b1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.992 [2024-11-18 14:20:51.049269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.992 #16 NEW cov: 12476 ft: 14210 corp: 9/162b lim: 30 exec/s: 0 rss: 73Mb L: 22/23 MS: 4 ChangeBit-CrossOver-ChangeByte-InsertRepeatedBytes- 00:07:54.992 [2024-11-18 14:20:51.088737] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.992 [2024-11-18 14:20:51.088850] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.992 [2024-11-18 14:20:51.089064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.992 [2024-11-18 14:20:51.089088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.992 [2024-11-18 14:20:51.089141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.992 [2024-11-18 14:20:51.089155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.252 #17 NEW cov: 12476 ft: 14234 corp: 10/176b lim: 30 exec/s: 0 rss: 73Mb L: 14/23 MS: 1 ChangeBit- 00:07:55.252 [2024-11-18 14:20:51.148946] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:55.252 [2024-11-18 14:20:51.149061] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log 
page offset 0x30000a18b 00:07:55.252 [2024-11-18 14:20:51.149167] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:55.252 [2024-11-18 14:20:51.149368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aa383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.252 [2024-11-18 14:20:51.149394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.252 [2024-11-18 14:20:51.149448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a1d183fd cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.252 [2024-11-18 14:20:51.149462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.252 [2024-11-18 14:20:51.149515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:8b0083a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.252 [2024-11-18 14:20:51.149529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.252 #18 NEW cov: 12476 ft: 14306 corp: 11/198b lim: 30 exec/s: 0 rss: 73Mb L: 22/23 MS: 1 CMP- DE: "\241\321\375\223\241\213\213\000"- 00:07:55.252 [2024-11-18 14:20:51.189066] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:55.252 [2024-11-18 14:20:51.189179] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a131 00:07:55.252 [2024-11-18 14:20:51.189283] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:55.252 [2024-11-18 14:20:51.189487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aa383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.252 [2024-11-18 14:20:51.189512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.252 [2024-11-18 14:20:51.189577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a1d183fd cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.252 [2024-11-18 14:20:51.189591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.252 [2024-11-18 14:20:51.189642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:8b0083a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.252 [2024-11-18 14:20:51.189655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.252 #19 NEW cov: 12476 ft: 14317 corp: 12/220b lim: 30 exec/s: 0 rss: 73Mb L: 22/23 MS: 1 ChangeByte- 00:07:55.252 [2024-11-18 14:20:51.249191] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.252 [2024-11-18 14:20:51.249304] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.252 [2024-11-18 14:20:51.249523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.252 [2024-11-18 14:20:51.249553] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.252 [2024-11-18 14:20:51.249609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.252 [2024-11-18 14:20:51.249623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.252 #20 NEW cov: 12476 ft: 14339 corp: 13/237b lim: 30 exec/s: 0 rss: 73Mb L: 17/23 MS: 1 ShuffleBytes- 00:07:55.252 [2024-11-18 14:20:51.289346] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1000093a1 00:07:55.252 [2024-11-18 14:20:51.289463] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000e0a3 00:07:55.252 [2024-11-18 14:20:51.289574] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:55.252 [2024-11-18 14:20:51.289773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aa181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.252 [2024-11-18 14:20:51.289798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.252 [2024-11-18 14:20:51.289854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8b8b8300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.252 [2024-11-18 14:20:51.289868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.252 [2024-11-18 14:20:51.289922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:a3a383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.252 [2024-11-18 14:20:51.289936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.252 NEW_FUNC[1/1]: 0x1c52b68 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:55.252 #21 NEW cov: 12499 ft: 14438 corp: 14/259b lim: 30 exec/s: 0 rss: 73Mb L: 22/23 MS: 1 PersAutoDict- DE: "\241\321\375\223\241\213\213\000"- 00:07:55.252 [2024-11-18 14:20:51.349547] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1000093a1 00:07:55.252 [2024-11-18 14:20:51.349666] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000e0a3 00:07:55.252 [2024-11-18 14:20:51.349773] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a1d1 00:07:55.252 [2024-11-18 14:20:51.349880] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:55.252 [2024-11-18 14:20:51.350081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aa181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.252 [2024-11-18 14:20:51.350109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.252 [2024-11-18 14:20:51.350163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8b8b8300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.252 [2024-11-18 14:20:51.350176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:07:55.252 [2024-11-18 14:20:51.350228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:a3a383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.252 [2024-11-18 14:20:51.350242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.252 [2024-11-18 14:20:51.350292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:fd9383a1 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.252 [2024-11-18 14:20:51.350305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.513 #22 NEW cov: 12499 ft: 14936 corp: 15/286b lim: 30 exec/s: 22 rss: 73Mb L: 27/27 MS: 1 CopyPart- 00:07:55.513 [2024-11-18 14:20:51.409679] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:55.513 [2024-11-18 14:20:51.409794] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:55.513 [2024-11-18 14:20:51.409901] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:55.513 [2024-11-18 14:20:51.410101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aa383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.513 [2024-11-18 14:20:51.410127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.513 [2024-11-18 14:20:51.410180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a3a383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.513 [2024-11-18 14:20:51.410195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.513 [2024-11-18 14:20:51.410248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:a3a383e0 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.513 [2024-11-18 14:20:51.410262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.513 #23 NEW cov: 12499 ft: 14968 corp: 16/308b lim: 30 exec/s: 23 rss: 73Mb L: 22/27 MS: 1 CopyPart- 00:07:55.513 [2024-11-18 14:20:51.449699] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1000093a1 00:07:55.513 [2024-11-18 14:20:51.449903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aa181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.513 [2024-11-18 14:20:51.449928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.513 #24 NEW cov: 12499 ft: 15343 corp: 17/318b lim: 30 exec/s: 24 rss: 73Mb L: 10/27 MS: 1 CrossOver- 00:07:55.513 [2024-11-18 14:20:51.489887] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1000093a1 00:07:55.513 [2024-11-18 14:20:51.490004] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000e0a3 00:07:55.513 [2024-11-18 14:20:51.490111] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:55.513 [2024-11-18 14:20:51.490316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aa181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.513 [2024-11-18 14:20:51.490341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.513 [2024-11-18 14:20:51.490394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8b8b8300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.513 [2024-11-18 14:20:51.490411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.513 [2024-11-18 14:20:51.490466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:a3a383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.513 [2024-11-18 14:20:51.490480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.513 #25 NEW cov: 12499 ft: 15354 corp: 18/341b lim: 30 exec/s: 25 rss: 73Mb L: 23/27 MS: 1 InsertByte- 00:07:55.513 [2024-11-18 14:20:51.529981] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1000093a1 00:07:55.513 [2024-11-18 14:20:51.530192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aa181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.513 [2024-11-18 14:20:51.530217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.513 #26 NEW cov: 12499 ft: 15364 corp: 19/352b lim: 30 exec/s: 26 rss: 73Mb L: 11/27 MS: 1 CrossOver- 00:07:55.513 [2024-11-18 14:20:51.570090] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000fd93 00:07:55.513 [2024-11-18 14:20:51.570205] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xffff 00:07:55.513 [2024-11-18 14:20:51.570408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff81a1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.513 [2024-11-18 14:20:51.570433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.513 [2024-11-18 14:20:51.570489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a18b008b cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.513 [2024-11-18 14:20:51.570503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.513 #27 NEW cov: 12499 ft: 15380 corp: 20/369b lim: 30 exec/s: 27 rss: 73Mb L: 17/27 MS: 1 PersAutoDict- DE: "\241\321\375\223\241\213\213\000"- 00:07:55.513 [2024-11-18 14:20:51.630262] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1000093a1 00:07:55.513 [2024-11-18 14:20:51.630376] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000e0a3 00:07:55.513 [2024-11-18 14:20:51.630578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aa181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.513 [2024-11-18 14:20:51.630604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.514 [2024-11-18 14:20:51.630658] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8b8b8300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.514 [2024-11-18 14:20:51.630672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.774 #28 NEW cov: 12499 ft: 15454 corp: 21/385b lim: 30 exec/s: 28 rss: 73Mb L: 16/27 MS: 1 EraseBytes- 00:07:55.774 [2024-11-18 14:20:51.690473] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.774 [2024-11-18 14:20:51.690596] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:55.774 [2024-11-18 14:20:51.690700] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000a0a 00:07:55.774 [2024-11-18 14:20:51.690905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff3083ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.774 [2024-11-18 14:20:51.690930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.774 [2024-11-18 14:20:51.690986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.774 [2024-11-18 14:20:51.691004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.774 [2024-11-18 14:20:51.691057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.774 [2024-11-18 14:20:51.691071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.774 #29 NEW cov: 12499 ft: 15466 corp: 22/404b lim: 30 exec/s: 29 rss: 73Mb L: 19/27 MS: 1 ChangeByte- 00:07:55.774 [2024-11-18 14:20:51.750663] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1000093a1 00:07:55.774 [2024-11-18 14:20:51.750778] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000e0a3 00:07:55.774 [2024-11-18 14:20:51.750888] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a1d1 00:07:55.774 [2024-11-18 14:20:51.750992] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:55.774 [2024-11-18 14:20:51.751209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aa181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.774 [2024-11-18 14:20:51.751235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.774 [2024-11-18 14:20:51.751293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8b8b831b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.774 [2024-11-18 14:20:51.751307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.774 [2024-11-18 14:20:51.751360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:a3a383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.774 [2024-11-18 14:20:51.751374] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.774 [2024-11-18 14:20:51.751424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:fd9383a1 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.774 [2024-11-18 14:20:51.751438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.774 #30 NEW cov: 12499 ft: 15517 corp: 23/431b lim: 30 exec/s: 30 rss: 74Mb L: 27/27 MS: 1 ChangeBinInt- 00:07:55.774 [2024-11-18 14:20:51.810741] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1000093a1 00:07:55.774 [2024-11-18 14:20:51.810968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aa181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.774 [2024-11-18 14:20:51.810993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.774 #31 NEW cov: 12499 ft: 15574 corp: 24/441b lim: 30 exec/s: 31 rss: 74Mb L: 10/27 MS: 1 ChangeByte- 00:07:55.774 [2024-11-18 14:20:51.870922] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000fd93 00:07:55.774 [2024-11-18 14:20:51.871036] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (892372) > buf size (4096) 00:07:55.774 [2024-11-18 14:20:51.871235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff81a1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.774 [2024-11-18 14:20:51.871259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.774 [2024-11-18 14:20:51.871314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:67748374 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.774 [2024-11-18 14:20:51.871328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.035 #32 NEW cov: 12522 ft: 15616 corp: 25/458b lim: 30 exec/s: 32 rss: 74Mb L: 17/27 MS: 1 ChangeBinInt- 00:07:56.035 [2024-11-18 14:20:51.931116] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:56.035 [2024-11-18 14:20:51.931230] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:56.035 [2024-11-18 14:20:51.931338] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:56.035 [2024-11-18 14:20:51.931547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aa383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.035 [2024-11-18 14:20:51.931577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.035 [2024-11-18 14:20:51.931632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a3a383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.035 [2024-11-18 14:20:51.931647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.035 [2024-11-18 14:20:51.931699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) 
qid:0 cid:6 nsid:0 cdw10:a3a383e0 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.035 [2024-11-18 14:20:51.931711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.035 #33 NEW cov: 12522 ft: 15622 corp: 26/481b lim: 30 exec/s: 33 rss: 74Mb L: 23/27 MS: 1 InsertByte- 00:07:56.035 [2024-11-18 14:20:51.991297] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1000093a1 00:07:56.035 [2024-11-18 14:20:51.991412] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:56.035 [2024-11-18 14:20:51.991624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aa181d1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.035 [2024-11-18 14:20:51.991649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.035 [2024-11-18 14:20:51.991703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:6fa383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.035 [2024-11-18 14:20:51.991718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.035 #34 NEW cov: 12522 ft: 15694 corp: 27/493b lim: 30 exec/s: 34 rss: 74Mb L: 12/27 MS: 1 InsertByte- 00:07:56.035 [2024-11-18 14:20:52.051482] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000ffff 00:07:56.035 [2024-11-18 14:20:52.051604] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:56.035 [2024-11-18 14:20:52.051712] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000aa3 00:07:56.035 [2024-11-18 14:20:52.051913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff02ff cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.035 [2024-11-18 14:20:52.051937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.035 [2024-11-18 14:20:52.051993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.035 [2024-11-18 14:20:52.052007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.035 [2024-11-18 14:20:52.052061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff02ff cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.035 [2024-11-18 14:20:52.052075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.035 #35 NEW cov: 12522 ft: 15706 corp: 28/511b lim: 30 exec/s: 35 rss: 74Mb L: 18/27 MS: 1 InsertByte- 00:07:56.035 [2024-11-18 14:20:52.091545] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d1fd 00:07:56.035 [2024-11-18 14:20:52.091665] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3000000ff 00:07:56.035 [2024-11-18 14:20:52.091891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff81a1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.035 [2024-11-18 
14:20:52.091916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.035 [2024-11-18 14:20:52.091972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:93a1838b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.035 [2024-11-18 14:20:52.091986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.035 #36 NEW cov: 12522 ft: 15747 corp: 29/528b lim: 30 exec/s: 36 rss: 74Mb L: 17/27 MS: 1 CopyPart- 00:07:56.035 [2024-11-18 14:20:52.131665] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:56.035 [2024-11-18 14:20:52.131781] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a3a3 00:07:56.035 [2024-11-18 14:20:52.131889] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:56.035 [2024-11-18 14:20:52.132103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aa383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.035 [2024-11-18 14:20:52.132128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.035 [2024-11-18 14:20:52.132182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a3a381a3 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.035 [2024-11-18 14:20:52.132195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.035 [2024-11-18 14:20:52.132249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:a3a383e0 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.035 [2024-11-18 14:20:52.132262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.300 #37 NEW cov: 12522 ft: 15763 corp: 30/551b lim: 30 exec/s: 37 rss: 74Mb L: 23/27 MS: 1 ChangeBinInt- 00:07:56.300 [2024-11-18 14:20:52.191867] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a306 00:07:56.300 [2024-11-18 14:20:52.191981] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (954000) > buf size (4096) 00:07:56.300 [2024-11-18 14:20:52.192091] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:56.300 [2024-11-18 14:20:52.192304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aa383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.300 [2024-11-18 14:20:52.192330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.300 [2024-11-18 14:20:52.192387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a3a383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.300 [2024-11-18 14:20:52.192400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.300 [2024-11-18 14:20:52.192456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:a3a383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:56.300 [2024-11-18 14:20:52.192469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.300 #38 NEW cov: 12522 ft: 15765 corp: 31/574b lim: 30 exec/s: 38 rss: 74Mb L: 23/27 MS: 1 InsertByte- 00:07:56.300 [2024-11-18 14:20:52.232024] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:56.300 [2024-11-18 14:20:52.232145] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:56.300 [2024-11-18 14:20:52.232253] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:56.300 [2024-11-18 14:20:52.232356] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:56.300 [2024-11-18 14:20:52.232571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a83a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.300 [2024-11-18 14:20:52.232596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.300 [2024-11-18 14:20:52.232651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a3a383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.300 [2024-11-18 14:20:52.232665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.300 [2024-11-18 14:20:52.232721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:a3a383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.300 [2024-11-18 14:20:52.232734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.300 [2024-11-18 14:20:52.232788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:a3a383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.300 [2024-11-18 14:20:52.232801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.300 #39 NEW cov: 12522 ft: 15788 corp: 32/602b lim: 30 exec/s: 39 rss: 74Mb L: 28/28 MS: 1 CrossOver- 00:07:56.300 [2024-11-18 14:20:52.272088] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:56.300 [2024-11-18 14:20:52.272202] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:56.300 [2024-11-18 14:20:52.272308] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a3a3 00:07:56.300 [2024-11-18 14:20:52.272515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aa383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.300 [2024-11-18 14:20:52.272540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.300 [2024-11-18 14:20:52.272598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:a7a383a3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.300 [2024-11-18 14:20:52.272612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.300 [2024-11-18 14:20:52.272666] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:a3a383e0 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.300 [2024-11-18 14:20:52.272678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.300 #45 NEW cov: 12522 ft: 15807 corp: 33/624b lim: 30 exec/s: 45 rss: 74Mb L: 22/28 MS: 1 ChangeBit- 00:07:56.301 [2024-11-18 14:20:52.312095] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000a18b 00:07:56.301 [2024-11-18 14:20:52.312311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:a1d183fd cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.301 [2024-11-18 14:20:52.312336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.301 #46 NEW cov: 12522 ft: 15829 corp: 34/633b lim: 30 exec/s: 46 rss: 74Mb L: 9/28 MS: 1 EraseBytes- 00:07:56.301 [2024-11-18 14:20:52.352215] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008ba1 00:07:56.301 [2024-11-18 14:20:52.352430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:a1d183fd cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.301 [2024-11-18 14:20:52.352457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.301 #47 NEW cov: 12522 ft: 15854 corp: 35/642b lim: 30 exec/s: 23 rss: 74Mb L: 9/28 MS: 1 ShuffleBytes- 00:07:56.301 #47 DONE cov: 12522 ft: 15854 corp: 35/642b lim: 30 exec/s: 23 rss: 74Mb 00:07:56.301 ###### Recommended dictionary. ###### 00:07:56.301 "\241\321\375\223\241\213\213\000" # Uses: 2 00:07:56.301 ###### End of recommended dictionary. 
###### 00:07:56.301 Done 47 runs in 2 second(s) 00:07:56.561 14:20:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_1.conf /var/tmp/suppress_nvmf_fuzz 00:07:56.561 14:20:52 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:56.561 14:20:52 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:56.561 14:20:52 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:56.561 14:20:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:56.561 14:20:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:56.561 14:20:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:56.561 14:20:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:56.561 14:20:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:56.561 14:20:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:56.561 14:20:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:56.561 14:20:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 2 00:07:56.561 14:20:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4402 00:07:56.561 14:20:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:56.561 14:20:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:56.561 14:20:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:56.561 14:20:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:56.561 14:20:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:56.561 14:20:52 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 00:07:56.561 [2024-11-18 14:20:52.539359] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:07:56.561 [2024-11-18 14:20:52.539427] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid322857 ] 00:07:56.821 [2024-11-18 14:20:52.738753] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:56.821 [2024-11-18 14:20:52.751838] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:56.821 [2024-11-18 14:20:52.804199] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:56.821 [2024-11-18 14:20:52.820525] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:56.821 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:56.821 INFO: Seed: 4075688733 00:07:56.821 INFO: Loaded 1 modules (387495 inline 8-bit counters): 387495 [0x2acbe4c, 0x2b2a7f3), 00:07:56.821 INFO: Loaded 1 PC tables (387495 PCs): 387495 [0x2b2a7f8,0x3114268), 00:07:56.821 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:56.821 INFO: A corpus is not provided, starting from an empty corpus 00:07:56.821 #2 INITED exec/s: 0 rss: 65Mb 00:07:56.821 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:56.821 This may also happen if the target rejected all inputs we tried so far 00:07:56.821 [2024-11-18 14:20:52.886728] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:56.821 [2024-11-18 14:20:52.886913] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:56.821 [2024-11-18 14:20:52.887074] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:56.821 [2024-11-18 14:20:52.887413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000027 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.821 [2024-11-18 14:20:52.887450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.821 [2024-11-18 14:20:52.887578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.821 [2024-11-18 14:20:52.887603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.821 [2024-11-18 14:20:52.887717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.821 [2024-11-18 14:20:52.887742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.821 [2024-11-18 14:20:52.887874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.821 [2024-11-18 14:20:52.887904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.391 NEW_FUNC[1/715]: 0x45c9f8 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:57.391 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:57.392 #11 NEW cov: 12238 ft: 12239 corp: 2/33b lim: 35 exec/s: 0 rss: 72Mb L: 32/32 MS: 4 ChangeByte-ShuffleBytes-ChangeBinInt-InsertRepeatedBytes- 00:07:57.392 [2024-11-18 14:20:53.238361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.392 [2024-11-18 14:20:53.238411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.392 [2024-11-18 14:20:53.238570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:07:57.392 [2024-11-18 14:20:53.238595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.392 [2024-11-18 14:20:53.238728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.392 [2024-11-18 14:20:53.238752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.392 #12 NEW cov: 12352 ft: 13548 corp: 3/60b lim: 35 exec/s: 0 rss: 72Mb L: 27/32 MS: 1 InsertRepeatedBytes- 00:07:57.392 [2024-11-18 14:20:53.298678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.392 [2024-11-18 14:20:53.298709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.392 [2024-11-18 14:20:53.298836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.392 [2024-11-18 14:20:53.298853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.392 [2024-11-18 14:20:53.298980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.392 [2024-11-18 14:20:53.298997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.392 [2024-11-18 14:20:53.299126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.392 [2024-11-18 14:20:53.299143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.392 [2024-11-18 14:20:53.299275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.392 [2024-11-18 14:20:53.299291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.392 #13 NEW cov: 12358 ft: 13753 corp: 4/95b lim: 35 exec/s: 0 rss: 72Mb L: 35/35 MS: 1 CrossOver- 00:07:57.392 [2024-11-18 14:20:53.367845] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:57.392 [2024-11-18 14:20:53.368216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000027 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.392 [2024-11-18 14:20:53.368247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.392 [2024-11-18 14:20:53.368371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.392 [2024-11-18 14:20:53.368394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.392 #14 NEW cov: 12443 ft: 14272 corp: 5/113b lim: 35 
exec/s: 0 rss: 72Mb L: 18/35 MS: 1 EraseBytes- 00:07:57.392 [2024-11-18 14:20:53.439086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.392 [2024-11-18 14:20:53.439113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.392 [2024-11-18 14:20:53.439251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.392 [2024-11-18 14:20:53.439270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.392 [2024-11-18 14:20:53.439407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.392 [2024-11-18 14:20:53.439423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.392 [2024-11-18 14:20:53.439554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.392 [2024-11-18 14:20:53.439572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.392 [2024-11-18 14:20:53.439701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.392 [2024-11-18 14:20:53.439721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.392 #15 NEW cov: 12443 ft: 14331 corp: 6/148b lim: 35 exec/s: 0 rss: 72Mb L: 35/35 MS: 1 ShuffleBytes- 00:07:57.392 [2024-11-18 14:20:53.508400] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:57.392 [2024-11-18 14:20:53.508594] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:57.392 [2024-11-18 14:20:53.508781] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:57.392 [2024-11-18 14:20:53.509156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000027 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.392 [2024-11-18 14:20:53.509186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.392 [2024-11-18 14:20:53.509321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.392 [2024-11-18 14:20:53.509345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.392 [2024-11-18 14:20:53.509477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.392 [2024-11-18 14:20:53.509505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.392 [2024-11-18 14:20:53.509649] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.392 [2024-11-18 14:20:53.509675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.652 #16 NEW cov: 12443 ft: 14383 corp: 7/180b lim: 35 exec/s: 0 rss: 72Mb L: 32/35 MS: 1 ChangeBit- 00:07:57.652 [2024-11-18 14:20:53.559393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.652 [2024-11-18 14:20:53.559421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.652 [2024-11-18 14:20:53.559546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.652 [2024-11-18 14:20:53.559567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.652 [2024-11-18 14:20:53.559695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.652 [2024-11-18 14:20:53.559713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.652 [2024-11-18 14:20:53.559838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff0009 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.652 [2024-11-18 14:20:53.559855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.652 [2024-11-18 14:20:53.559980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.652 [2024-11-18 14:20:53.559996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.652 #17 NEW cov: 12443 ft: 14537 corp: 8/215b lim: 35 exec/s: 0 rss: 73Mb L: 35/35 MS: 1 ChangeBinInt- 00:07:57.652 [2024-11-18 14:20:53.628799] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:57.652 [2024-11-18 14:20:53.628976] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:57.652 [2024-11-18 14:20:53.629124] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:57.652 [2024-11-18 14:20:53.629487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000027 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.652 [2024-11-18 14:20:53.629515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.653 [2024-11-18 14:20:53.629650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.653 [2024-11-18 14:20:53.629675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.653 [2024-11-18 
14:20:53.629795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.653 [2024-11-18 14:20:53.629820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.653 [2024-11-18 14:20:53.629937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.653 [2024-11-18 14:20:53.629965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.653 #18 NEW cov: 12443 ft: 14580 corp: 9/247b lim: 35 exec/s: 0 rss: 73Mb L: 32/35 MS: 1 ChangeBit- 00:07:57.653 [2024-11-18 14:20:53.679805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.653 [2024-11-18 14:20:53.679830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.653 [2024-11-18 14:20:53.679960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.653 [2024-11-18 14:20:53.679977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.653 [2024-11-18 14:20:53.680091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.653 [2024-11-18 14:20:53.680110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.653 [2024-11-18 14:20:53.680237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.653 [2024-11-18 14:20:53.680253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.653 [2024-11-18 14:20:53.680372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.653 [2024-11-18 14:20:53.680390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.653 #19 NEW cov: 12443 ft: 14630 corp: 10/282b lim: 35 exec/s: 0 rss: 73Mb L: 35/35 MS: 1 ShuffleBytes- 00:07:57.653 [2024-11-18 14:20:53.729608] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:57.653 [2024-11-18 14:20:53.729973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.653 [2024-11-18 14:20:53.729999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.653 [2024-11-18 14:20:53.730131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.653 [2024-11-18 14:20:53.730148] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.653 [2024-11-18 14:20:53.730271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.653 [2024-11-18 14:20:53.730290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.653 [2024-11-18 14:20:53.730425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff0009 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.653 [2024-11-18 14:20:53.730445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.653 [2024-11-18 14:20:53.730571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:ffff0000 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.653 [2024-11-18 14:20:53.730604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.653 NEW_FUNC[1/1]: 0x1c52b68 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:57.653 #20 NEW cov: 12466 ft: 14699 corp: 11/317b lim: 35 exec/s: 0 rss: 73Mb L: 35/35 MS: 1 CrossOver- 00:07:57.913 [2024-11-18 14:20:53.799982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.913 [2024-11-18 14:20:53.800008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.913 [2024-11-18 14:20:53.800141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.913 [2024-11-18 14:20:53.800158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.913 [2024-11-18 14:20:53.800282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.913 [2024-11-18 14:20:53.800298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.913 [2024-11-18 14:20:53.800420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.913 [2024-11-18 14:20:53.800440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.913 #21 NEW cov: 12466 ft: 14731 corp: 12/345b lim: 35 exec/s: 0 rss: 73Mb L: 28/35 MS: 1 EraseBytes- 00:07:57.913 [2024-11-18 14:20:53.869918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.913 [2024-11-18 14:20:53.869947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.913 [2024-11-18 14:20:53.870072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 
cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.913 [2024-11-18 14:20:53.870091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.913 [2024-11-18 14:20:53.870213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.913 [2024-11-18 14:20:53.870229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.913 #22 NEW cov: 12466 ft: 14787 corp: 13/368b lim: 35 exec/s: 22 rss: 73Mb L: 23/35 MS: 1 EraseBytes- 00:07:57.913 [2024-11-18 14:20:53.920326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.913 [2024-11-18 14:20:53.920352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.913 [2024-11-18 14:20:53.920486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.913 [2024-11-18 14:20:53.920505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.913 [2024-11-18 14:20:53.920644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.913 [2024-11-18 14:20:53.920661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.913 [2024-11-18 14:20:53.920787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.913 [2024-11-18 14:20:53.920805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.913 #23 NEW cov: 12466 ft: 14807 corp: 14/399b lim: 35 exec/s: 23 rss: 73Mb L: 31/35 MS: 1 EraseBytes- 00:07:57.913 [2024-11-18 14:20:53.970710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.914 [2024-11-18 14:20:53.970736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.914 [2024-11-18 14:20:53.970869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.914 [2024-11-18 14:20:53.970888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.914 [2024-11-18 14:20:53.971015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.914 [2024-11-18 14:20:53.971033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.914 [2024-11-18 14:20:53.971167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff 
cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.914 [2024-11-18 14:20:53.971183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.914 [2024-11-18 14:20:53.971318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.914 [2024-11-18 14:20:53.971335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.914 #24 NEW cov: 12466 ft: 14848 corp: 15/434b lim: 35 exec/s: 24 rss: 73Mb L: 35/35 MS: 1 CopyPart- 00:07:57.914 [2024-11-18 14:20:54.020659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.914 [2024-11-18 14:20:54.020687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.914 [2024-11-18 14:20:54.020816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.914 [2024-11-18 14:20:54.020835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.914 [2024-11-18 14:20:54.020971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.914 [2024-11-18 14:20:54.020989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.914 [2024-11-18 14:20:54.021119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.914 [2024-11-18 14:20:54.021137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.174 #25 NEW cov: 12466 ft: 14869 corp: 16/462b lim: 35 exec/s: 25 rss: 73Mb L: 28/35 MS: 1 InsertByte- 00:07:58.174 [2024-11-18 14:20:54.071057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.174 [2024-11-18 14:20:54.071084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.174 [2024-11-18 14:20:54.071215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00fbff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.174 [2024-11-18 14:20:54.071233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.174 [2024-11-18 14:20:54.071371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.174 [2024-11-18 14:20:54.071388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.174 [2024-11-18 14:20:54.071521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff0009 cdw11:ff00ffff SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:07:58.174 [2024-11-18 14:20:54.071537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.174 [2024-11-18 14:20:54.071674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.174 [2024-11-18 14:20:54.071691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.174 #26 NEW cov: 12466 ft: 14874 corp: 17/497b lim: 35 exec/s: 26 rss: 73Mb L: 35/35 MS: 1 ChangeBit- 00:07:58.174 [2024-11-18 14:20:54.120700] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:58.174 [2024-11-18 14:20:54.121214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.174 [2024-11-18 14:20:54.121243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.174 [2024-11-18 14:20:54.121369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.174 [2024-11-18 14:20:54.121388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.174 [2024-11-18 14:20:54.121513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ff0100ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.174 [2024-11-18 14:20:54.121531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.174 [2024-11-18 14:20:54.121665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.174 [2024-11-18 14:20:54.121691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.174 [2024-11-18 14:20:54.121820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.174 [2024-11-18 14:20:54.121838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.174 #27 NEW cov: 12466 ft: 14898 corp: 18/532b lim: 35 exec/s: 27 rss: 73Mb L: 35/35 MS: 1 ChangeBinInt- 00:07:58.174 [2024-11-18 14:20:54.170914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.174 [2024-11-18 14:20:54.170945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.174 [2024-11-18 14:20:54.171079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.174 [2024-11-18 14:20:54.171095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.174 [2024-11-18 14:20:54.171225] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.174 [2024-11-18 14:20:54.171244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.174 #28 NEW cov: 12466 ft: 14926 corp: 19/555b lim: 35 exec/s: 28 rss: 73Mb L: 23/35 MS: 1 ShuffleBytes- 00:07:58.174 [2024-11-18 14:20:54.240686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff000aff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.174 [2024-11-18 14:20:54.240713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.174 #29 NEW cov: 12466 ft: 15212 corp: 20/567b lim: 35 exec/s: 29 rss: 73Mb L: 12/35 MS: 1 CrossOver- 00:07:58.435 [2024-11-18 14:20:54.311224] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:58.435 [2024-11-18 14:20:54.311586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000027 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.435 [2024-11-18 14:20:54.311613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.435 [2024-11-18 14:20:54.311743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:a3a300a3 cdw11:a300a3a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.435 [2024-11-18 14:20:54.311759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.435 [2024-11-18 14:20:54.311888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:a3a300a3 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.435 [2024-11-18 14:20:54.311908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.435 [2024-11-18 14:20:54.312045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.435 [2024-11-18 14:20:54.312070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.435 #30 NEW cov: 12466 ft: 15225 corp: 21/595b lim: 35 exec/s: 30 rss: 73Mb L: 28/35 MS: 1 InsertRepeatedBytes- 00:07:58.435 [2024-11-18 14:20:54.381794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.435 [2024-11-18 14:20:54.381824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.435 [2024-11-18 14:20:54.381950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.435 [2024-11-18 14:20:54.381969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.435 [2024-11-18 14:20:54.382105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:07:58.435 [2024-11-18 14:20:54.382122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.435 [2024-11-18 14:20:54.382257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00a3 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.435 [2024-11-18 14:20:54.382277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.435 #31 NEW cov: 12466 ft: 15231 corp: 22/624b lim: 35 exec/s: 31 rss: 73Mb L: 29/35 MS: 1 InsertByte- 00:07:58.435 [2024-11-18 14:20:54.451512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.435 [2024-11-18 14:20:54.451541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.435 [2024-11-18 14:20:54.451685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff000aff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.435 [2024-11-18 14:20:54.451701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.435 #32 NEW cov: 12466 ft: 15255 corp: 23/640b lim: 35 exec/s: 32 rss: 73Mb L: 16/35 MS: 1 CrossOver- 00:07:58.435 [2024-11-18 14:20:54.501488] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:58.435 [2024-11-18 14:20:54.501689] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:58.435 [2024-11-18 14:20:54.501860] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:58.435 [2024-11-18 14:20:54.502224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000027 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.435 [2024-11-18 14:20:54.502254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.435 [2024-11-18 14:20:54.502378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.435 [2024-11-18 14:20:54.502405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.435 [2024-11-18 14:20:54.502534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.435 [2024-11-18 14:20:54.502564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.435 [2024-11-18 14:20:54.502687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00020000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.435 [2024-11-18 14:20:54.502711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.435 #33 NEW cov: 12466 ft: 15261 corp: 24/672b lim: 35 exec/s: 33 rss: 73Mb L: 32/35 MS: 1 ChangeBit- 00:07:58.695 [2024-11-18 14:20:54.572328] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.695 [2024-11-18 14:20:54.572355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.695 [2024-11-18 14:20:54.572476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.695 [2024-11-18 14:20:54.572497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.695 [2024-11-18 14:20:54.572628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.695 [2024-11-18 14:20:54.572648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.695 [2024-11-18 14:20:54.572777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.695 [2024-11-18 14:20:54.572798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.695 #34 NEW cov: 12466 ft: 15275 corp: 25/703b lim: 35 exec/s: 34 rss: 74Mb L: 31/35 MS: 1 ChangeBinInt- 00:07:58.695 [2024-11-18 14:20:54.642140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.695 [2024-11-18 14:20:54.642168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.695 [2024-11-18 14:20:54.642286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.695 [2024-11-18 14:20:54.642303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.695 #35 NEW cov: 12466 ft: 15335 corp: 26/721b lim: 35 exec/s: 35 rss: 74Mb L: 18/35 MS: 1 EraseBytes- 00:07:58.695 [2024-11-18 14:20:54.692022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff000aff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.695 [2024-11-18 14:20:54.692052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.695 #36 NEW cov: 12466 ft: 15348 corp: 27/733b lim: 35 exec/s: 36 rss: 74Mb L: 12/35 MS: 1 ChangeByte- 00:07:58.695 [2024-11-18 14:20:54.762372] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:58.695 [2024-11-18 14:20:54.763053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff02000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.695 [2024-11-18 14:20:54.763085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.695 [2024-11-18 14:20:54.763211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:58.695 [2024-11-18 14:20:54.763238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.695 [2024-11-18 14:20:54.763376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.695 [2024-11-18 14:20:54.763394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.695 [2024-11-18 14:20:54.763524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00a3 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.695 [2024-11-18 14:20:54.763545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.695 #37 NEW cov: 12466 ft: 15365 corp: 28/762b lim: 35 exec/s: 37 rss: 74Mb L: 29/35 MS: 1 CMP- DE: "\002\000\000\000\000\000\000\000"- 00:07:58.955 [2024-11-18 14:20:54.833112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.955 [2024-11-18 14:20:54.833140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.955 [2024-11-18 14:20:54.833287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.955 [2024-11-18 14:20:54.833305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.955 [2024-11-18 14:20:54.833427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.955 [2024-11-18 14:20:54.833451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.955 [2024-11-18 14:20:54.833587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.955 [2024-11-18 14:20:54.833605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.955 #38 NEW cov: 12466 ft: 15378 corp: 29/796b lim: 35 exec/s: 19 rss: 74Mb L: 34/35 MS: 1 InsertRepeatedBytes- 00:07:58.955 #38 DONE cov: 12466 ft: 15378 corp: 29/796b lim: 35 exec/s: 19 rss: 74Mb 00:07:58.955 ###### Recommended dictionary. ###### 00:07:58.955 "\002\000\000\000\000\000\000\000" # Uses: 0 00:07:58.955 ###### End of recommended dictionary. 
###### 00:07:58.955 Done 38 runs in 2 second(s) 00:07:58.955 14:20:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_2.conf /var/tmp/suppress_nvmf_fuzz 00:07:58.955 14:20:54 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:58.955 14:20:54 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:58.955 14:20:54 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:58.955 14:20:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:07:58.955 14:20:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:58.955 14:20:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:58.955 14:20:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:58.955 14:20:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:07:58.955 14:20:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:58.955 14:20:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:58.956 14:20:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 3 00:07:58.956 14:20:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4403 00:07:58.956 14:20:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:58.956 14:20:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:07:58.956 14:20:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:58.956 14:20:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:58.956 14:20:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:58.956 14:20:54 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 00:07:58.956 [2024-11-18 14:20:55.019704] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:07:58.956 [2024-11-18 14:20:55.019769] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid323386 ] 00:07:59.216 [2024-11-18 14:20:55.219048] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:59.216 [2024-11-18 14:20:55.231757] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:59.216 [2024-11-18 14:20:55.284052] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:59.216 [2024-11-18 14:20:55.300339] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:07:59.216 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:59.216 INFO: Seed: 2259695523 00:07:59.216 INFO: Loaded 1 modules (387495 inline 8-bit counters): 387495 [0x2acbe4c, 0x2b2a7f3), 00:07:59.216 INFO: Loaded 1 PC tables (387495 PCs): 387495 [0x2b2a7f8,0x3114268), 00:07:59.216 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:59.216 INFO: A corpus is not provided, starting from an empty corpus 00:07:59.216 #2 INITED exec/s: 0 rss: 64Mb 00:07:59.216 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:59.216 This may also happen if the target rejected all inputs we tried so far 00:07:59.475 [2024-11-18 14:20:55.359455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:59.475 [2024-11-18 14:20:55.359485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.735 NEW_FUNC[1/724]: 0x45e6d8 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:59.735 NEW_FUNC[2/724]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:59.735 #3 NEW cov: 12465 ft: 12466 corp: 2/14b lim: 20 exec/s: 0 rss: 73Mb L: 13/13 MS: 1 InsertRepeatedBytes- 00:07:59.735 #8 NEW cov: 12579 ft: 13550 corp: 3/23b lim: 20 exec/s: 0 rss: 73Mb L: 9/13 MS: 5 ChangeBinInt-ShuffleBytes-ChangeBit-CrossOver-InsertRepeatedBytes- 00:07:59.735 #10 NEW cov: 12602 ft: 13904 corp: 4/41b lim: 20 exec/s: 0 rss: 73Mb L: 18/18 MS: 2 CopyPart-InsertRepeatedBytes- 00:07:59.735 #11 NEW cov: 12687 ft: 14280 corp: 5/61b lim: 20 exec/s: 0 rss: 73Mb L: 20/20 MS: 1 CopyPart- 00:07:59.735 #13 NEW cov: 12687 ft: 14355 corp: 6/70b lim: 20 exec/s: 0 rss: 73Mb L: 9/20 MS: 2 CopyPart-CrossOver- 00:07:59.996 #19 NEW cov: 12687 ft: 14426 corp: 7/88b lim: 20 exec/s: 0 rss: 73Mb L: 18/20 MS: 1 ChangeByte- 00:07:59.996 #20 NEW cov: 12687 ft: 14522 corp: 8/106b lim: 20 exec/s: 0 rss: 73Mb L: 18/20 MS: 1 ChangeBinInt- 00:07:59.996 [2024-11-18 14:20:55.981053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:59.996 [2024-11-18 14:20:55.981084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.996 #21 NEW cov: 12687 ft: 14544 corp: 9/119b lim: 20 exec/s: 0 rss: 73Mb L: 13/20 MS: 1 CopyPart- 00:07:59.996 #22 NEW cov: 12687 ft: 14829 corp: 10/125b lim: 20 exec/s: 0 rss: 74Mb L: 6/20 MS: 1 EraseBytes- 00:08:00.256 #23 NEW cov: 12687 ft: 14877 corp: 11/145b lim: 20 exec/s: 0 rss: 74Mb L: 20/20 MS: 1 CrossOver- 00:08:00.256 [2024-11-18 14:20:56.161555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:08:00.256 [2024-11-18 14:20:56.161582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.256 #24 NEW cov: 12690 ft: 14987 corp: 12/159b lim: 20 exec/s: 0 rss: 74Mb L: 14/20 MS: 1 InsertByte- 00:08:00.256 #25 NEW cov: 12690 ft: 15040 corp: 13/169b lim: 20 exec/s: 0 rss: 74Mb L: 10/20 MS: 1 InsertByte- 00:08:00.256 NEW_FUNC[1/1]: 0x1c52b68 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:00.256 #26 NEW cov: 12713 
ft: 15079 corp: 14/187b lim: 20 exec/s: 0 rss: 74Mb L: 18/20 MS: 1 ChangeBit- 00:08:00.256 [2024-11-18 14:20:56.301918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:08:00.256 [2024-11-18 14:20:56.301943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.256 #27 NEW cov: 12713 ft: 15128 corp: 15/200b lim: 20 exec/s: 27 rss: 74Mb L: 13/20 MS: 1 ChangeBit- 00:08:00.516 #28 NEW cov: 12713 ft: 15135 corp: 16/220b lim: 20 exec/s: 28 rss: 74Mb L: 20/20 MS: 1 CopyPart- 00:08:00.516 #29 NEW cov: 12713 ft: 15178 corp: 17/238b lim: 20 exec/s: 29 rss: 74Mb L: 18/20 MS: 1 ChangeByte- 00:08:00.516 [2024-11-18 14:20:56.462216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:08:00.516 [2024-11-18 14:20:56.462241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.516 #30 NEW cov: 12713 ft: 15235 corp: 18/246b lim: 20 exec/s: 30 rss: 74Mb L: 8/20 MS: 1 EraseBytes- 00:08:00.516 #31 NEW cov: 12713 ft: 15245 corp: 19/264b lim: 20 exec/s: 31 rss: 74Mb L: 18/20 MS: 1 ChangeBinInt- 00:08:00.516 #32 NEW cov: 12713 ft: 15282 corp: 20/282b lim: 20 exec/s: 32 rss: 74Mb L: 18/20 MS: 1 ShuffleBytes- 00:08:00.516 #33 NEW cov: 12713 ft: 15337 corp: 21/291b lim: 20 exec/s: 33 rss: 74Mb L: 9/20 MS: 1 CrossOver- 00:08:00.776 #34 NEW cov: 12713 ft: 15345 corp: 22/311b lim: 20 exec/s: 34 rss: 74Mb L: 20/20 MS: 1 CMP- DE: "\001\000"- 00:08:00.776 #35 NEW cov: 12713 ft: 15356 corp: 23/331b lim: 20 exec/s: 35 rss: 74Mb L: 20/20 MS: 1 CrossOver- 00:08:00.776 #36 NEW cov: 12713 ft: 15369 corp: 24/351b lim: 20 exec/s: 36 rss: 74Mb L: 20/20 MS: 1 InsertRepeatedBytes- 00:08:00.776 [2024-11-18 14:20:56.803262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:08:00.776 [2024-11-18 14:20:56.803288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.776 #37 NEW cov: 12714 ft: 15398 corp: 25/366b lim: 20 exec/s: 37 rss: 75Mb L: 15/20 MS: 1 PersAutoDict- DE: "\001\000"- 00:08:00.776 #38 NEW cov: 12714 ft: 15423 corp: 26/372b lim: 20 exec/s: 38 rss: 75Mb L: 6/20 MS: 1 ChangeByte- 00:08:01.036 #39 NEW cov: 12714 ft: 15505 corp: 27/379b lim: 20 exec/s: 39 rss: 75Mb L: 7/20 MS: 1 EraseBytes- 00:08:01.036 #40 NEW cov: 12714 ft: 15524 corp: 28/399b lim: 20 exec/s: 40 rss: 75Mb L: 20/20 MS: 1 CopyPart- 00:08:01.036 #41 NEW cov: 12714 ft: 15533 corp: 29/419b lim: 20 exec/s: 41 rss: 75Mb L: 20/20 MS: 1 CopyPart- 00:08:01.036 #42 NEW cov: 12714 ft: 15541 corp: 30/439b lim: 20 exec/s: 42 rss: 75Mb L: 20/20 MS: 1 ChangeByte- 00:08:01.036 [2024-11-18 14:20:57.124338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:08:01.036 [2024-11-18 14:20:57.124364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.296 #43 NEW cov: 12714 ft: 15615 corp: 31/456b lim: 20 exec/s: 43 rss: 75Mb L: 17/20 MS: 1 CMP- DE: "\000\006"- 00:08:01.296 [2024-11-18 14:20:57.184357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 
cdw10:00000000 cdw11:00000000 00:08:01.296 [2024-11-18 14:20:57.184382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.296 #44 NEW cov: 12714 ft: 15624 corp: 32/469b lim: 20 exec/s: 44 rss: 75Mb L: 13/20 MS: 1 ChangeBinInt- 00:08:01.296 #45 NEW cov: 12714 ft: 15634 corp: 33/489b lim: 20 exec/s: 45 rss: 75Mb L: 20/20 MS: 1 PersAutoDict- DE: "\000\006"- 00:08:01.296 #46 NEW cov: 12714 ft: 15647 corp: 34/509b lim: 20 exec/s: 46 rss: 75Mb L: 20/20 MS: 1 ChangeByte- 00:08:01.296 #47 NEW cov: 12714 ft: 15654 corp: 35/529b lim: 20 exec/s: 23 rss: 75Mb L: 20/20 MS: 1 ChangeBinInt- 00:08:01.296 #47 DONE cov: 12714 ft: 15654 corp: 35/529b lim: 20 exec/s: 23 rss: 75Mb 00:08:01.296 ###### Recommended dictionary. ###### 00:08:01.296 "\001\000" # Uses: 1 00:08:01.296 "\000\006" # Uses: 1 00:08:01.296 ###### End of recommended dictionary. ###### 00:08:01.296 Done 47 runs in 2 second(s) 00:08:01.557 14:20:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_3.conf /var/tmp/suppress_nvmf_fuzz 00:08:01.557 14:20:57 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:01.557 14:20:57 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:01.557 14:20:57 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:01.557 14:20:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:08:01.557 14:20:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:01.557 14:20:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:01.557 14:20:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:01.557 14:20:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:08:01.557 14:20:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:01.557 14:20:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:01.557 14:20:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 4 00:08:01.557 14:20:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4404 00:08:01.557 14:20:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:01.557 14:20:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:08:01.557 14:20:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:01.557 14:20:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:01.557 14:20:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:01.557 14:20:57 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 00:08:01.557 [2024-11-18 
14:20:57.511125] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:08:01.557 [2024-11-18 14:20:57.511209] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid323690 ] 00:08:01.817 [2024-11-18 14:20:57.712255] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:01.817 [2024-11-18 14:20:57.724850] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:01.817 [2024-11-18 14:20:57.777198] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:01.817 [2024-11-18 14:20:57.793530] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:08:01.817 INFO: Running with entropic power schedule (0xFF, 100). 00:08:01.817 INFO: Seed: 457729061 00:08:01.817 INFO: Loaded 1 modules (387495 inline 8-bit counters): 387495 [0x2acbe4c, 0x2b2a7f3), 00:08:01.817 INFO: Loaded 1 PC tables (387495 PCs): 387495 [0x2b2a7f8,0x3114268), 00:08:01.817 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:01.817 INFO: A corpus is not provided, starting from an empty corpus 00:08:01.817 #2 INITED exec/s: 0 rss: 64Mb 00:08:01.817 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:01.817 This may also happen if the target rejected all inputs we tried so far 00:08:01.817 [2024-11-18 14:20:57.863012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.817 [2024-11-18 14:20:57.863048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.076 NEW_FUNC[1/716]: 0x45f7d8 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:08:02.076 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:02.076 #19 NEW cov: 12249 ft: 12235 corp: 2/12b lim: 35 exec/s: 0 rss: 72Mb L: 11/11 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:02.076 [2024-11-18 14:20:58.193885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.076 [2024-11-18 14:20:58.193932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.336 #20 NEW cov: 12362 ft: 12999 corp: 3/23b lim: 35 exec/s: 0 rss: 72Mb L: 11/11 MS: 1 ShuffleBytes- 00:08:02.336 [2024-11-18 14:20:58.253909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:02002601 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.336 [2024-11-18 14:20:58.253938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.336 #29 NEW cov: 12368 ft: 13247 corp: 4/34b lim: 35 exec/s: 0 rss: 72Mb L: 11/11 MS: 4 ChangeBinInt-InsertByte-InsertByte-CMP- DE: "\002\000\000\000\000\000\000\000"- 00:08:02.336 [2024-11-18 14:20:58.304782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 
cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.336 [2024-11-18 14:20:58.304807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.336 [2024-11-18 14:20:58.304933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0af4ffff cdw11:f4f40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.336 [2024-11-18 14:20:58.304950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.336 [2024-11-18 14:20:58.305073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:f4f4f4f4 cdw11:f4f40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.336 [2024-11-18 14:20:58.305088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.336 [2024-11-18 14:20:58.305198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:f4f4f4f4 cdw11:f4f40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.336 [2024-11-18 14:20:58.305214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.336 #30 NEW cov: 12453 ft: 14271 corp: 5/67b lim: 35 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:02.336 [2024-11-18 14:20:58.375061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.336 [2024-11-18 14:20:58.375088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.336 [2024-11-18 14:20:58.375220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0af4ffff cdw11:f4f40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.336 [2024-11-18 14:20:58.375238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.336 [2024-11-18 14:20:58.375366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:0b0b0b0b cdw11:0b0b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.336 [2024-11-18 14:20:58.375381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.337 [2024-11-18 14:20:58.375493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:f4f4f4f4 cdw11:f4f40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.337 [2024-11-18 14:20:58.375511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.337 #31 NEW cov: 12453 ft: 14386 corp: 6/100b lim: 35 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 ChangeBinInt- 00:08:02.337 [2024-11-18 14:20:58.445234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.337 [2024-11-18 14:20:58.445261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.337 [2024-11-18 14:20:58.445383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) 
qid:0 cid:5 nsid:0 cdw10:0a10ffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.337 [2024-11-18 14:20:58.445411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.337 [2024-11-18 14:20:58.445532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:f4f40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.337 [2024-11-18 14:20:58.445553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.337 [2024-11-18 14:20:58.445678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:f4f4f4f4 cdw11:f4f40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.337 [2024-11-18 14:20:58.445696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.597 #32 NEW cov: 12453 ft: 14439 corp: 7/133b lim: 35 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 CMP- DE: "\020\000\000\000\000\000\000\000"- 00:08:02.597 [2024-11-18 14:20:58.495423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.597 [2024-11-18 14:20:58.495449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.597 [2024-11-18 14:20:58.495573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0a10ffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.597 [2024-11-18 14:20:58.495591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.597 [2024-11-18 14:20:58.495713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:f4f40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.597 [2024-11-18 14:20:58.495731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.597 [2024-11-18 14:20:58.495850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.597 [2024-11-18 14:20:58.495868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.597 #33 NEW cov: 12453 ft: 14495 corp: 8/166b lim: 35 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 PersAutoDict- DE: "\002\000\000\000\000\000\000\000"- 00:08:02.597 [2024-11-18 14:20:58.565664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.597 [2024-11-18 14:20:58.565692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.597 [2024-11-18 14:20:58.565813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0af4ffff cdw11:f4f40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.597 [2024-11-18 14:20:58.565831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.597 [2024-11-18 14:20:58.565958] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:f4f4f4f4 cdw11:f4f40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.597 [2024-11-18 14:20:58.565973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.597 [2024-11-18 14:20:58.566095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:f4f4f4f4 cdw11:f4f40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.597 [2024-11-18 14:20:58.566110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.597 #34 NEW cov: 12453 ft: 14554 corp: 9/199b lim: 35 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 CopyPart- 00:08:02.597 [2024-11-18 14:20:58.616189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.597 [2024-11-18 14:20:58.616214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.597 [2024-11-18 14:20:58.616335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:0a100000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.597 [2024-11-18 14:20:58.616352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.597 [2024-11-18 14:20:58.616475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.597 [2024-11-18 14:20:58.616493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.597 [2024-11-18 14:20:58.616619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:0200f4f4 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.597 [2024-11-18 14:20:58.616638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.597 [2024-11-18 14:20:58.616755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:00f40000 cdw11:f4f40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.597 [2024-11-18 14:20:58.616773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:02.597 #35 NEW cov: 12453 ft: 14648 corp: 10/234b lim: 35 exec/s: 0 rss: 72Mb L: 35/35 MS: 1 CopyPart- 00:08:02.597 [2024-11-18 14:20:58.685210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.597 [2024-11-18 14:20:58.685236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.597 #36 NEW cov: 12453 ft: 14699 corp: 11/246b lim: 35 exec/s: 0 rss: 72Mb L: 12/35 MS: 1 CrossOver- 00:08:02.858 [2024-11-18 14:20:58.735560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:02002601 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.858 [2024-11-18 14:20:58.735587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.858 [2024-11-18 14:20:58.735718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:9f000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.858 [2024-11-18 14:20:58.735736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.858 NEW_FUNC[1/1]: 0x1c52b68 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:02.858 #37 NEW cov: 12476 ft: 14967 corp: 12/263b lim: 35 exec/s: 0 rss: 73Mb L: 17/35 MS: 1 CopyPart- 00:08:02.858 [2024-11-18 14:20:58.805530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000ff10 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.858 [2024-11-18 14:20:58.805559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.858 #38 NEW cov: 12476 ft: 14988 corp: 13/274b lim: 35 exec/s: 0 rss: 73Mb L: 11/35 MS: 1 PersAutoDict- DE: "\020\000\000\000\000\000\000\000"- 00:08:02.858 [2024-11-18 14:20:58.856495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.858 [2024-11-18 14:20:58.856524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.858 [2024-11-18 14:20:58.856651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0af4ffff cdw11:f4f40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.858 [2024-11-18 14:20:58.856668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.858 [2024-11-18 14:20:58.856793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:0b0b0b0b cdw11:0b0b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.858 [2024-11-18 14:20:58.856813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.858 [2024-11-18 14:20:58.856932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:f4f4f4f4 cdw11:f4f40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.858 [2024-11-18 14:20:58.856951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.858 #39 NEW cov: 12476 ft: 15011 corp: 14/307b lim: 35 exec/s: 39 rss: 73Mb L: 33/35 MS: 1 ShuffleBytes- 00:08:02.858 [2024-11-18 14:20:58.927061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffb4ffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.858 [2024-11-18 14:20:58.927088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.858 [2024-11-18 14:20:58.927217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:0a100000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.858 [2024-11-18 14:20:58.927234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.858 [2024-11-18 
14:20:58.927348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.858 [2024-11-18 14:20:58.927365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.858 [2024-11-18 14:20:58.927488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:0200f4f4 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.858 [2024-11-18 14:20:58.927506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.858 [2024-11-18 14:20:58.927633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:00f40000 cdw11:f4f40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.858 [2024-11-18 14:20:58.927651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:02.858 #40 NEW cov: 12476 ft: 15033 corp: 15/342b lim: 35 exec/s: 40 rss: 73Mb L: 35/35 MS: 1 ChangeByte- 00:08:03.118 [2024-11-18 14:20:58.997057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.118 [2024-11-18 14:20:58.997084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.118 [2024-11-18 14:20:58.997211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0a10ffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.118 [2024-11-18 14:20:58.997228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.118 [2024-11-18 14:20:58.997346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:f4f40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.118 [2024-11-18 14:20:58.997361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.119 [2024-11-18 14:20:58.997477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:f4f4f4f4 cdw11:f4280003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.119 [2024-11-18 14:20:58.997493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.119 #41 NEW cov: 12476 ft: 15064 corp: 16/375b lim: 35 exec/s: 41 rss: 73Mb L: 33/35 MS: 1 ChangeByte- 00:08:03.119 [2024-11-18 14:20:59.047146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.119 [2024-11-18 14:20:59.047175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.119 [2024-11-18 14:20:59.047291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0affffff cdw11:10000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.119 [2024-11-18 14:20:59.047307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.119 [2024-11-18 
14:20:59.047422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00f40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.119 [2024-11-18 14:20:59.047440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.119 [2024-11-18 14:20:59.047552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:f4f4f4f4 cdw11:f4f40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.119 [2024-11-18 14:20:59.047569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.119 #42 NEW cov: 12476 ft: 15095 corp: 17/409b lim: 35 exec/s: 42 rss: 73Mb L: 34/35 MS: 1 CopyPart- 00:08:03.119 [2024-11-18 14:20:59.117363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.119 [2024-11-18 14:20:59.117392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.119 [2024-11-18 14:20:59.117504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0a10ffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.119 [2024-11-18 14:20:59.117521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.119 [2024-11-18 14:20:59.117651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:f4f40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.119 [2024-11-18 14:20:59.117667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.119 [2024-11-18 14:20:59.117785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.119 [2024-11-18 14:20:59.117802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.119 #43 NEW cov: 12476 ft: 15127 corp: 18/442b lim: 35 exec/s: 43 rss: 73Mb L: 33/35 MS: 1 CopyPart- 00:08:03.119 [2024-11-18 14:20:59.167817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff006a cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.119 [2024-11-18 14:20:59.167845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.119 [2024-11-18 14:20:59.167968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:0af40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.119 [2024-11-18 14:20:59.167985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.119 [2024-11-18 14:20:59.168103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:f4f4f4f4 cdw11:f4f40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.119 [2024-11-18 14:20:59.168119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.119 [2024-11-18 
14:20:59.168239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:f4f4f4f4 cdw11:f4f40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.119 [2024-11-18 14:20:59.168259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.119 [2024-11-18 14:20:59.168374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:f4f4f4f4 cdw11:00f40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.119 [2024-11-18 14:20:59.168392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:03.119 #48 NEW cov: 12476 ft: 15137 corp: 19/477b lim: 35 exec/s: 48 rss: 73Mb L: 35/35 MS: 5 EraseBytes-ShuffleBytes-ChangeBinInt-ChangeByte-CrossOver- 00:08:03.119 [2024-11-18 14:20:59.227330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:300b0b0a cdw11:0a260000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.119 [2024-11-18 14:20:59.227358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.119 [2024-11-18 14:20:59.227478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.119 [2024-11-18 14:20:59.227495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.119 [2024-11-18 14:20:59.227612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:9f000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.119 [2024-11-18 14:20:59.227629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.379 #53 NEW cov: 12476 ft: 15387 corp: 20/500b lim: 35 exec/s: 53 rss: 73Mb L: 23/35 MS: 5 CopyPart-ChangeBit-InsertByte-CopyPart-CrossOver- 00:08:03.379 [2024-11-18 14:20:59.277591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:300b0b0a cdw11:0a260000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.379 [2024-11-18 14:20:59.277618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.379 [2024-11-18 14:20:59.277738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.379 [2024-11-18 14:20:59.277755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.379 [2024-11-18 14:20:59.277871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.379 [2024-11-18 14:20:59.277887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.379 #54 NEW cov: 12476 ft: 15401 corp: 21/527b lim: 35 exec/s: 54 rss: 73Mb L: 27/35 MS: 1 InsertRepeatedBytes- 00:08:03.379 [2024-11-18 14:20:59.348245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:03.379 [2024-11-18 14:20:59.348272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.379 [2024-11-18 14:20:59.348387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:0a100000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.379 [2024-11-18 14:20:59.348404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.379 [2024-11-18 14:20:59.348517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.379 [2024-11-18 14:20:59.348534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.379 [2024-11-18 14:20:59.348653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:0200f4f4 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.379 [2024-11-18 14:20:59.348674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.379 [2024-11-18 14:20:59.348797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:00f40000 cdw11:f4f40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.379 [2024-11-18 14:20:59.348815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:03.379 #55 NEW cov: 12476 ft: 15442 corp: 22/562b lim: 35 exec/s: 55 rss: 73Mb L: 35/35 MS: 1 ShuffleBytes- 00:08:03.379 [2024-11-18 14:20:59.398125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.379 [2024-11-18 14:20:59.398153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.379 [2024-11-18 14:20:59.398284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0af4ffff cdw11:f4f40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.379 [2024-11-18 14:20:59.398301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.379 [2024-11-18 14:20:59.398418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:0b0b0b0b cdw11:0b0b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.379 [2024-11-18 14:20:59.398434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.379 [2024-11-18 14:20:59.398554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:0afff4ff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.379 [2024-11-18 14:20:59.398573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.379 #56 NEW cov: 12476 ft: 15476 corp: 23/595b lim: 35 exec/s: 56 rss: 73Mb L: 33/35 MS: 1 CrossOver- 00:08:03.379 [2024-11-18 14:20:59.467522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0200ffff cdw11:00000003 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:03.379 [2024-11-18 14:20:59.467555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.640 #57 NEW cov: 12476 ft: 15487 corp: 24/607b lim: 35 exec/s: 57 rss: 73Mb L: 12/35 MS: 1 ChangeBinInt- 00:08:03.640 [2024-11-18 14:20:59.538553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.640 [2024-11-18 14:20:59.538580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.640 [2024-11-18 14:20:59.538703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0af4ffff cdw11:f4f40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.640 [2024-11-18 14:20:59.538719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.640 [2024-11-18 14:20:59.538843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:0b0b0b0b cdw11:0b0b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.640 [2024-11-18 14:20:59.538860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.640 [2024-11-18 14:20:59.538978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:60f4f4f4 cdw11:f4f40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.640 [2024-11-18 14:20:59.538995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.640 #58 NEW cov: 12476 ft: 15502 corp: 25/640b lim: 35 exec/s: 58 rss: 73Mb L: 33/35 MS: 1 ChangeByte- 00:08:03.640 [2024-11-18 14:20:59.588411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:02002601 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.640 [2024-11-18 14:20:59.588437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.640 [2024-11-18 14:20:59.588561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:9f000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.640 [2024-11-18 14:20:59.588577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.640 [2024-11-18 14:20:59.588706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.640 [2024-11-18 14:20:59.588722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.640 #59 NEW cov: 12476 ft: 15527 corp: 26/667b lim: 35 exec/s: 59 rss: 73Mb L: 27/35 MS: 1 InsertRepeatedBytes- 00:08:03.640 [2024-11-18 14:20:59.658138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:fffffffd cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.640 [2024-11-18 14:20:59.658166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.640 #60 NEW cov: 12476 ft: 15549 corp: 27/678b lim: 
35 exec/s: 60 rss: 73Mb L: 11/35 MS: 1 ChangeBit- 00:08:03.640 [2024-11-18 14:20:59.698668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.640 [2024-11-18 14:20:59.698694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.640 [2024-11-18 14:20:59.698817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:f4f40003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.640 [2024-11-18 14:20:59.698835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.640 [2024-11-18 14:20:59.698958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:f4f4f4f4 cdw11:f4280003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.640 [2024-11-18 14:20:59.698974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.640 #61 NEW cov: 12476 ft: 15575 corp: 28/704b lim: 35 exec/s: 61 rss: 73Mb L: 26/35 MS: 1 EraseBytes- 00:08:03.640 [2024-11-18 14:20:59.748295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff7f0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.640 [2024-11-18 14:20:59.748321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.901 #62 NEW cov: 12476 ft: 15585 corp: 29/715b lim: 35 exec/s: 62 rss: 73Mb L: 11/35 MS: 1 ChangeBit- 00:08:03.901 [2024-11-18 14:20:59.799516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.901 [2024-11-18 14:20:59.799542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.901 [2024-11-18 14:20:59.799689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:0a100000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.901 [2024-11-18 14:20:59.799705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.901 [2024-11-18 14:20:59.799828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00002f00 cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.901 [2024-11-18 14:20:59.799855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.901 [2024-11-18 14:20:59.799980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:0200f4f4 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.901 [2024-11-18 14:20:59.799999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.901 [2024-11-18 14:20:59.800120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:00f40000 cdw11:f4f40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.901 [2024-11-18 14:20:59.800138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:03.901 #63 NEW cov: 12476 ft: 15595 corp: 30/750b lim: 35 exec/s: 63 rss: 73Mb L: 35/35 MS: 1 ChangeByte- 00:08:03.901 [2024-11-18 14:20:59.848702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00008000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.901 [2024-11-18 14:20:59.848732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.901 #65 NEW cov: 12476 ft: 15598 corp: 31/759b lim: 35 exec/s: 32 rss: 73Mb L: 9/35 MS: 2 ChangeBit-CMP- DE: "\200\000\000\000\000\000\000\000"- 00:08:03.901 #65 DONE cov: 12476 ft: 15598 corp: 31/759b lim: 35 exec/s: 32 rss: 73Mb 00:08:03.901 ###### Recommended dictionary. ###### 00:08:03.901 "\002\000\000\000\000\000\000\000" # Uses: 1 00:08:03.901 "\020\000\000\000\000\000\000\000" # Uses: 1 00:08:03.901 "\200\000\000\000\000\000\000\000" # Uses: 0 00:08:03.901 ###### End of recommended dictionary. ###### 00:08:03.901 Done 65 runs in 2 second(s) 00:08:03.901 14:20:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_4.conf /var/tmp/suppress_nvmf_fuzz 00:08:03.901 14:20:59 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:03.901 14:20:59 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:03.901 14:20:59 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:03.901 14:20:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:08:03.901 14:20:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:03.901 14:20:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:03.901 14:20:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:03.901 14:20:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:08:03.901 14:20:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:03.901 14:20:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:03.901 14:20:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 5 00:08:03.901 14:20:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4405 00:08:03.901 14:20:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:03.901 14:20:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:08:03.901 14:20:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:03.901 14:20:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:03.901 14:20:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:03.901 14:20:59 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c 
/tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 00:08:03.902 [2024-11-18 14:21:00.013806] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:08:03.902 [2024-11-18 14:21:00.013874] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid324208 ] 00:08:04.161 [2024-11-18 14:21:00.223209] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:04.161 [2024-11-18 14:21:00.236589] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:04.421 [2024-11-18 14:21:00.289260] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:04.421 [2024-11-18 14:21:00.305584] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:08:04.421 INFO: Running with entropic power schedule (0xFF, 100). 00:08:04.421 INFO: Seed: 2971741676 00:08:04.421 INFO: Loaded 1 modules (387495 inline 8-bit counters): 387495 [0x2acbe4c, 0x2b2a7f3), 00:08:04.421 INFO: Loaded 1 PC tables (387495 PCs): 387495 [0x2b2a7f8,0x3114268), 00:08:04.421 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:04.421 INFO: A corpus is not provided, starting from an empty corpus 00:08:04.421 #2 INITED exec/s: 0 rss: 65Mb 00:08:04.421 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:04.421 This may also happen if the target rejected all inputs we tried so far 00:08:04.421 [2024-11-18 14:21:00.361068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:008b2b0a cdw11:8ba60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.421 [2024-11-18 14:21:00.361095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.681 NEW_FUNC[1/716]: 0x461978 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:08:04.681 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:04.681 #5 NEW cov: 12260 ft: 12252 corp: 2/11b lim: 45 exec/s: 0 rss: 72Mb L: 10/10 MS: 3 InsertByte-CopyPart-CMP- DE: "\000\213\213\246\3561\222\300"- 00:08:04.681 [2024-11-18 14:21:00.702227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.681 [2024-11-18 14:21:00.702278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.681 #7 NEW cov: 12373 ft: 12929 corp: 3/25b lim: 45 exec/s: 0 rss: 72Mb L: 14/14 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:04.681 [2024-11-18 14:21:00.751944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e2e2e2 cdw11:e2fd0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.681 [2024-11-18 14:21:00.751970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.681 #13 NEW cov: 12379 ft: 13257 corp: 4/42b lim: 45 exec/s: 0 rss: 72Mb L: 17/17 MS: 1 InsertRepeatedBytes- 
00:08:04.942 [2024-11-18 14:21:00.812268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:008b2b0a cdw11:8ba60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.942 [2024-11-18 14:21:00.812296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.942 [2024-11-18 14:21:00.812353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.942 [2024-11-18 14:21:00.812368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.942 #19 NEW cov: 12464 ft: 14368 corp: 5/62b lim: 45 exec/s: 0 rss: 72Mb L: 20/20 MS: 1 CrossOver- 00:08:04.942 [2024-11-18 14:21:00.872199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:008b2b0a cdw11:8ba60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.942 [2024-11-18 14:21:00.872226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.942 #20 NEW cov: 12464 ft: 14441 corp: 6/72b lim: 45 exec/s: 0 rss: 72Mb L: 10/20 MS: 1 ChangeByte- 00:08:04.942 [2024-11-18 14:21:00.912346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:008b2b0a cdw11:ef8b0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.942 [2024-11-18 14:21:00.912371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.942 #21 NEW cov: 12464 ft: 14516 corp: 7/83b lim: 45 exec/s: 0 rss: 72Mb L: 11/20 MS: 1 InsertByte- 00:08:04.942 [2024-11-18 14:21:00.972518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.942 [2024-11-18 14:21:00.972543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.942 #22 NEW cov: 12464 ft: 14590 corp: 8/97b lim: 45 exec/s: 0 rss: 73Mb L: 14/20 MS: 1 CrossOver- 00:08:04.942 [2024-11-18 14:21:01.012623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:005d2b0a cdw11:8ba60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.942 [2024-11-18 14:21:01.012649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.942 #23 NEW cov: 12464 ft: 14647 corp: 9/107b lim: 45 exec/s: 0 rss: 73Mb L: 10/20 MS: 1 ChangeByte- 00:08:04.942 [2024-11-18 14:21:01.052877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:008b2b0a cdw11:8ba60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.942 [2024-11-18 14:21:01.052903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.942 [2024-11-18 14:21:01.052959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:8b8bc000 cdw11:a6ee0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.942 [2024-11-18 14:21:01.052973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.202 #24 NEW cov: 12464 ft: 14684 corp: 
10/125b lim: 45 exec/s: 0 rss: 73Mb L: 18/20 MS: 1 PersAutoDict- DE: "\000\213\213\246\3561\222\300"- 00:08:05.202 [2024-11-18 14:21:01.093185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:008b2b0a cdw11:8ba60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.202 [2024-11-18 14:21:01.093211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.202 [2024-11-18 14:21:01.093267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.202 [2024-11-18 14:21:01.093281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.202 [2024-11-18 14:21:01.093337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:8be20a00 cdw11:92ef0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.202 [2024-11-18 14:21:01.093350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.202 #25 NEW cov: 12464 ft: 14977 corp: 11/156b lim: 45 exec/s: 0 rss: 73Mb L: 31/31 MS: 1 CrossOver- 00:08:05.202 [2024-11-18 14:21:01.153187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:008b2b0a cdw11:8ba60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.202 [2024-11-18 14:21:01.153212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.202 [2024-11-18 14:21:01.153267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:008b92c0 cdw11:8ba60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.202 [2024-11-18 14:21:01.153281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.202 #26 NEW cov: 12464 ft: 15006 corp: 12/175b lim: 45 exec/s: 0 rss: 73Mb L: 19/31 MS: 1 InsertByte- 00:08:05.202 [2024-11-18 14:21:01.213192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e2e2e2 cdw11:e2dc0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.202 [2024-11-18 14:21:01.213218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.202 #27 NEW cov: 12464 ft: 15012 corp: 13/189b lim: 45 exec/s: 0 rss: 73Mb L: 14/31 MS: 1 ChangeBinInt- 00:08:05.202 [2024-11-18 14:21:01.253331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00892b0a cdw11:ef8b0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.202 [2024-11-18 14:21:01.253356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.202 NEW_FUNC[1/1]: 0x1c52b68 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:05.202 #28 NEW cov: 12487 ft: 15068 corp: 14/200b lim: 45 exec/s: 0 rss: 73Mb L: 11/31 MS: 1 ChangeBit- 00:08:05.202 [2024-11-18 14:21:01.313647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:27270a27 cdw11:27270001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.202 [2024-11-18 14:21:01.313674] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.202 [2024-11-18 14:21:01.313729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:27272727 cdw11:27270001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.202 [2024-11-18 14:21:01.313743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.461 #30 NEW cov: 12487 ft: 15096 corp: 15/224b lim: 45 exec/s: 0 rss: 73Mb L: 24/31 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:05.461 [2024-11-18 14:21:01.353583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.461 [2024-11-18 14:21:01.353608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.462 #31 NEW cov: 12487 ft: 15146 corp: 16/236b lim: 45 exec/s: 31 rss: 73Mb L: 12/31 MS: 1 EraseBytes- 00:08:05.462 [2024-11-18 14:21:01.393713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ee31e2e2 cdw11:92e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.462 [2024-11-18 14:21:01.393738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.462 #32 NEW cov: 12487 ft: 15151 corp: 17/251b lim: 45 exec/s: 32 rss: 73Mb L: 15/31 MS: 1 CrossOver- 00:08:05.462 [2024-11-18 14:21:01.453863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ee31e2e2 cdw11:92e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.462 [2024-11-18 14:21:01.453887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.462 #33 NEW cov: 12487 ft: 15198 corp: 18/262b lim: 45 exec/s: 33 rss: 73Mb L: 11/31 MS: 1 EraseBytes- 00:08:05.462 [2024-11-18 14:21:01.514070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:008b2e0a cdw11:8ba60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.462 [2024-11-18 14:21:01.514094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.462 #34 NEW cov: 12487 ft: 15209 corp: 19/272b lim: 45 exec/s: 34 rss: 73Mb L: 10/31 MS: 1 ChangeByte- 00:08:05.462 [2024-11-18 14:21:01.554155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:008b2b0a cdw11:ef8b0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.462 [2024-11-18 14:21:01.554180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.462 #35 NEW cov: 12487 ft: 15226 corp: 20/283b lim: 45 exec/s: 35 rss: 73Mb L: 11/31 MS: 1 ChangeBit- 00:08:05.722 [2024-11-18 14:21:01.594265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.722 [2024-11-18 14:21:01.594290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.722 #36 NEW cov: 12487 ft: 15236 corp: 21/295b lim: 45 exec/s: 36 rss: 73Mb L: 12/31 MS: 1 ShuffleBytes- 00:08:05.722 [2024-11-18 14:21:01.634423] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:2b0a92c0 cdw11:008b0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.722 [2024-11-18 14:21:01.634449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.722 #37 NEW cov: 12487 ft: 15256 corp: 22/307b lim: 45 exec/s: 37 rss: 73Mb L: 12/31 MS: 1 CopyPart- 00:08:05.722 [2024-11-18 14:21:01.675041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:008b2b0a cdw11:8ba60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.722 [2024-11-18 14:21:01.675066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.722 [2024-11-18 14:21:01.675121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:e2e2e2e2 cdw11:008b0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.722 [2024-11-18 14:21:01.675134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.722 [2024-11-18 14:21:01.675187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:c0e23192 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.722 [2024-11-18 14:21:01.675200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.722 [2024-11-18 14:21:01.675257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:e292008b cdw11:ef8b0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.722 [2024-11-18 14:21:01.675270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.722 #38 NEW cov: 12487 ft: 15597 corp: 23/346b lim: 45 exec/s: 38 rss: 73Mb L: 39/39 MS: 1 PersAutoDict- DE: "\000\213\213\246\3561\222\300"- 00:08:05.722 [2024-11-18 14:21:01.734706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e2e2e2 cdw11:e2240004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.722 [2024-11-18 14:21:01.734731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.722 #39 NEW cov: 12487 ft: 15621 corp: 24/363b lim: 45 exec/s: 39 rss: 73Mb L: 17/39 MS: 1 CrossOver- 00:08:05.722 [2024-11-18 14:21:01.774837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.722 [2024-11-18 14:21:01.774867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.722 [2024-11-18 14:21:01.835010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0a00e22b cdw11:5d8b0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.722 [2024-11-18 14:21:01.835034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.982 #41 NEW cov: 12487 ft: 15644 corp: 25/375b lim: 45 exec/s: 41 rss: 73Mb L: 12/39 MS: 2 ChangeBinInt-CrossOver- 00:08:05.982 [2024-11-18 14:21:01.875459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 
cdw10:27270a27 cdw11:27270001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.982 [2024-11-18 14:21:01.875484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.982 [2024-11-18 14:21:01.875546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:27272727 cdw11:27000004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.982 [2024-11-18 14:21:01.875564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.982 [2024-11-18 14:21:01.875619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:92c0ee31 cdw11:27270001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.982 [2024-11-18 14:21:01.875632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.982 #42 NEW cov: 12487 ft: 15653 corp: 26/407b lim: 45 exec/s: 42 rss: 74Mb L: 32/39 MS: 1 PersAutoDict- DE: "\000\213\213\246\3561\222\300"- 00:08:05.982 [2024-11-18 14:21:01.935248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:008b2b0a cdw11:ef8b0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.982 [2024-11-18 14:21:01.935273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.982 #43 NEW cov: 12487 ft: 15693 corp: 27/418b lim: 45 exec/s: 43 rss: 74Mb L: 11/39 MS: 1 ChangeBinInt- 00:08:05.982 [2024-11-18 14:21:01.975386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:008b2b0a cdw11:8ba60001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.982 [2024-11-18 14:21:01.975410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.982 #45 NEW cov: 12487 ft: 15710 corp: 28/433b lim: 45 exec/s: 45 rss: 74Mb L: 15/39 MS: 2 CrossOver-CopyPart- 00:08:05.982 [2024-11-18 14:21:02.015441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.982 [2024-11-18 14:21:02.015466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.982 #46 NEW cov: 12487 ft: 15723 corp: 29/447b lim: 45 exec/s: 46 rss: 74Mb L: 14/39 MS: 1 CopyPart- 00:08:05.982 [2024-11-18 14:21:02.075917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e2e2e2 cdw11:e2fd0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.982 [2024-11-18 14:21:02.075941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.983 [2024-11-18 14:21:02.075998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.983 [2024-11-18 14:21:02.076012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.983 [2024-11-18 14:21:02.076082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.983 [2024-11-18 
14:21:02.076096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.243 #47 NEW cov: 12487 ft: 15734 corp: 30/482b lim: 45 exec/s: 47 rss: 74Mb L: 35/39 MS: 1 InsertRepeatedBytes- 00:08:06.243 [2024-11-18 14:21:02.135797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0a00e22b cdw11:8ba60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.243 [2024-11-18 14:21:02.135823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.243 #48 NEW cov: 12487 ft: 15761 corp: 31/493b lim: 45 exec/s: 48 rss: 74Mb L: 11/39 MS: 1 EraseBytes- 00:08:06.243 [2024-11-18 14:21:02.196273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:27270a27 cdw11:27270001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.243 [2024-11-18 14:21:02.196298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.243 [2024-11-18 14:21:02.196357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:27272727 cdw11:27000004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.243 [2024-11-18 14:21:02.196371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.243 [2024-11-18 14:21:02.196426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:92c0ee31 cdw11:27270001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.243 [2024-11-18 14:21:02.196439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.243 #49 NEW cov: 12487 ft: 15800 corp: 32/525b lim: 45 exec/s: 49 rss: 74Mb L: 32/39 MS: 1 ChangeByte- 00:08:06.243 [2024-11-18 14:21:02.256131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e2e2e2 cdw11:e2e20007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.243 [2024-11-18 14:21:02.256156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.243 #50 NEW cov: 12487 ft: 15854 corp: 33/537b lim: 45 exec/s: 50 rss: 74Mb L: 12/39 MS: 1 CopyPart- 00:08:06.243 [2024-11-18 14:21:02.296232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e2e2e2e2 cdw11:e2160000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.243 [2024-11-18 14:21:02.296259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.243 #51 NEW cov: 12487 ft: 15856 corp: 34/551b lim: 45 exec/s: 51 rss: 74Mb L: 14/39 MS: 1 ChangeBinInt- 00:08:06.243 [2024-11-18 14:21:02.356398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:008b2b0a cdw11:8ba60007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:06.243 [2024-11-18 14:21:02.356423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.503 #52 NEW cov: 12487 ft: 15864 corp: 35/561b lim: 45 exec/s: 26 rss: 74Mb L: 10/39 MS: 1 CopyPart- 00:08:06.503 #52 DONE cov: 12487 ft: 15864 corp: 35/561b lim: 45 exec/s: 26 rss: 74Mb 00:08:06.503 ###### Recommended dictionary. 
###### 00:08:06.503 "\000\213\213\246\3561\222\300" # Uses: 3 00:08:06.503 ###### End of recommended dictionary. ###### 00:08:06.503 Done 52 runs in 2 second(s) 00:08:06.503 14:21:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_5.conf /var/tmp/suppress_nvmf_fuzz 00:08:06.503 14:21:02 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:06.503 14:21:02 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:06.503 14:21:02 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:06.503 14:21:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:08:06.503 14:21:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:06.503 14:21:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:06.503 14:21:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:06.503 14:21:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:08:06.503 14:21:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:06.503 14:21:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:06.503 14:21:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 6 00:08:06.503 14:21:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4406 00:08:06.503 14:21:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:06.503 14:21:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:08:06.503 14:21:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:06.503 14:21:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:06.503 14:21:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:06.503 14:21:02 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 00:08:06.503 [2024-11-18 14:21:02.516056] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 
00:08:06.503 [2024-11-18 14:21:02.516122] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid324800 ] 00:08:06.763 [2024-11-18 14:21:02.716475] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:06.763 [2024-11-18 14:21:02.729415] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:06.763 [2024-11-18 14:21:02.781810] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:06.763 [2024-11-18 14:21:02.798168] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:08:06.763 INFO: Running with entropic power schedule (0xFF, 100). 00:08:06.763 INFO: Seed: 1169760500 00:08:06.763 INFO: Loaded 1 modules (387495 inline 8-bit counters): 387495 [0x2acbe4c, 0x2b2a7f3), 00:08:06.763 INFO: Loaded 1 PC tables (387495 PCs): 387495 [0x2b2a7f8,0x3114268), 00:08:06.763 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:06.763 INFO: A corpus is not provided, starting from an empty corpus 00:08:06.763 #2 INITED exec/s: 0 rss: 64Mb 00:08:06.763 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:06.763 This may also happen if the target rejected all inputs we tried so far 00:08:06.763 [2024-11-18 14:21:02.864200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ab5 cdw11:00000000 00:08:06.763 [2024-11-18 14:21:02.864236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.282 NEW_FUNC[1/714]: 0x464188 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:08:07.282 NEW_FUNC[2/714]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:07.282 #3 NEW cov: 12177 ft: 12178 corp: 2/3b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 1 InsertByte- 00:08:07.282 [2024-11-18 14:21:03.205167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b50a cdw11:00000000 00:08:07.282 [2024-11-18 14:21:03.205205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.282 #4 NEW cov: 12290 ft: 12700 corp: 3/5b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 1 ShuffleBytes- 00:08:07.282 [2024-11-18 14:21:03.275407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a3a cdw11:00000000 00:08:07.283 [2024-11-18 14:21:03.275439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.283 #5 NEW cov: 12296 ft: 13008 corp: 4/7b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 1 InsertByte- 00:08:07.283 [2024-11-18 14:21:03.325479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000bb5 cdw11:00000000 00:08:07.283 [2024-11-18 14:21:03.325507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.283 #6 NEW cov: 12381 ft: 13287 corp: 5/9b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 1 
ChangeBit- 00:08:07.283 [2024-11-18 14:21:03.375658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b50a cdw11:00000000 00:08:07.283 [2024-11-18 14:21:03.375688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.543 #7 NEW cov: 12381 ft: 13377 corp: 6/12b lim: 10 exec/s: 0 rss: 72Mb L: 3/3 MS: 1 CopyPart- 00:08:07.543 [2024-11-18 14:21:03.445821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b50a cdw11:00000000 00:08:07.543 [2024-11-18 14:21:03.445848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.543 #8 NEW cov: 12381 ft: 13515 corp: 7/15b lim: 10 exec/s: 0 rss: 72Mb L: 3/3 MS: 1 ShuffleBytes- 00:08:07.543 [2024-11-18 14:21:03.516335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b50a cdw11:00000000 00:08:07.543 [2024-11-18 14:21:03.516363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.543 [2024-11-18 14:21:03.516477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000b50a cdw11:00000000 00:08:07.543 [2024-11-18 14:21:03.516494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.543 #9 NEW cov: 12381 ft: 13765 corp: 8/20b lim: 10 exec/s: 0 rss: 72Mb L: 5/5 MS: 1 CopyPart- 00:08:07.543 [2024-11-18 14:21:03.566266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000cd0a cdw11:00000000 00:08:07.543 [2024-11-18 14:21:03.566294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.543 #10 NEW cov: 12381 ft: 13795 corp: 9/23b lim: 10 exec/s: 0 rss: 72Mb L: 3/5 MS: 1 InsertByte- 00:08:07.543 [2024-11-18 14:21:03.637212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b5bc cdw11:00000000 00:08:07.543 [2024-11-18 14:21:03.637241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.543 [2024-11-18 14:21:03.637361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000bcbc cdw11:00000000 00:08:07.543 [2024-11-18 14:21:03.637378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.543 [2024-11-18 14:21:03.637490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000bc0a cdw11:00000000 00:08:07.543 [2024-11-18 14:21:03.637507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.543 [2024-11-18 14:21:03.637628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000b50a cdw11:00000000 00:08:07.543 [2024-11-18 14:21:03.637644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.802 #11 NEW cov: 12381 ft: 14074 corp: 10/32b lim: 10 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 InsertRepeatedBytes- 
00:08:07.802 [2024-11-18 14:21:03.706941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000021b5 cdw11:00000000 00:08:07.802 [2024-11-18 14:21:03.706970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.802 [2024-11-18 14:21:03.707090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:07.802 [2024-11-18 14:21:03.707108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.802 NEW_FUNC[1/1]: 0x1c52b68 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:07.802 #12 NEW cov: 12404 ft: 14123 corp: 11/36b lim: 10 exec/s: 0 rss: 73Mb L: 4/9 MS: 1 InsertByte- 00:08:07.802 [2024-11-18 14:21:03.777573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000abab cdw11:00000000 00:08:07.802 [2024-11-18 14:21:03.777623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.802 [2024-11-18 14:21:03.777745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000abab cdw11:00000000 00:08:07.802 [2024-11-18 14:21:03.777763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.802 [2024-11-18 14:21:03.777881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000abab cdw11:00000000 00:08:07.802 [2024-11-18 14:21:03.777900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.803 [2024-11-18 14:21:03.778017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000abb1 cdw11:00000000 00:08:07.803 [2024-11-18 14:21:03.778032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.803 #15 NEW cov: 12404 ft: 14134 corp: 12/44b lim: 10 exec/s: 0 rss: 73Mb L: 8/9 MS: 3 EraseBytes-ChangeByte-InsertRepeatedBytes- 00:08:07.803 [2024-11-18 14:21:03.827333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b50a cdw11:00000000 00:08:07.803 [2024-11-18 14:21:03.827360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.803 [2024-11-18 14:21:03.827479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:07.803 [2024-11-18 14:21:03.827496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.803 #16 NEW cov: 12404 ft: 14140 corp: 13/48b lim: 10 exec/s: 16 rss: 73Mb L: 4/9 MS: 1 CrossOver- 00:08:07.803 [2024-11-18 14:21:03.877242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b50a cdw11:00000000 00:08:07.803 [2024-11-18 14:21:03.877271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.803 #17 NEW cov: 12404 ft: 14200 corp: 14/51b lim: 10 exec/s: 17 
rss: 73Mb L: 3/9 MS: 1 CrossOver- 00:08:08.062 [2024-11-18 14:21:03.947399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b10a cdw11:00000000 00:08:08.062 [2024-11-18 14:21:03.947428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.062 #18 NEW cov: 12404 ft: 14227 corp: 15/53b lim: 10 exec/s: 18 rss: 73Mb L: 2/9 MS: 1 ChangeBit- 00:08:08.062 [2024-11-18 14:21:03.997798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000bb5 cdw11:00000000 00:08:08.062 [2024-11-18 14:21:03.997826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.063 [2024-11-18 14:21:03.997941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000bb5 cdw11:00000000 00:08:08.063 [2024-11-18 14:21:03.997961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.063 #19 NEW cov: 12404 ft: 14264 corp: 16/57b lim: 10 exec/s: 19 rss: 73Mb L: 4/9 MS: 1 CopyPart- 00:08:08.063 [2024-11-18 14:21:04.068064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b50a cdw11:00000000 00:08:08.063 [2024-11-18 14:21:04.068092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.063 [2024-11-18 14:21:04.068210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:08.063 [2024-11-18 14:21:04.068230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.063 #20 NEW cov: 12404 ft: 14284 corp: 17/61b lim: 10 exec/s: 20 rss: 73Mb L: 4/9 MS: 1 CopyPart- 00:08:08.063 [2024-11-18 14:21:04.118662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b5bc cdw11:00000000 00:08:08.063 [2024-11-18 14:21:04.118690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.063 [2024-11-18 14:21:04.118814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000bcbc cdw11:00000000 00:08:08.063 [2024-11-18 14:21:04.118830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.063 [2024-11-18 14:21:04.118947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000bc0a cdw11:00000000 00:08:08.063 [2024-11-18 14:21:04.118967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.063 [2024-11-18 14:21:04.119092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000b50a cdw11:00000000 00:08:08.063 [2024-11-18 14:21:04.119109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.063 [2024-11-18 14:21:04.119230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000690a cdw11:00000000 00:08:08.063 [2024-11-18 
14:21:04.119247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:08.063 #21 NEW cov: 12404 ft: 14336 corp: 18/71b lim: 10 exec/s: 21 rss: 73Mb L: 10/10 MS: 1 InsertByte- 00:08:08.063 [2024-11-18 14:21:04.188832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b7bc cdw11:00000000 00:08:08.063 [2024-11-18 14:21:04.188859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.063 [2024-11-18 14:21:04.188980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000bcbc cdw11:00000000 00:08:08.063 [2024-11-18 14:21:04.188998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.063 [2024-11-18 14:21:04.189111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000bc0a cdw11:00000000 00:08:08.063 [2024-11-18 14:21:04.189128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.063 [2024-11-18 14:21:04.189243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000b50a cdw11:00000000 00:08:08.063 [2024-11-18 14:21:04.189261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.323 #22 NEW cov: 12404 ft: 14354 corp: 19/80b lim: 10 exec/s: 22 rss: 73Mb L: 9/10 MS: 1 ChangeBit- 00:08:08.323 [2024-11-18 14:21:04.238542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b500 cdw11:00000000 00:08:08.323 [2024-11-18 14:21:04.238572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.323 [2024-11-18 14:21:04.238687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 00:08:08.323 [2024-11-18 14:21:04.238703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.323 #23 NEW cov: 12404 ft: 14400 corp: 20/85b lim: 10 exec/s: 23 rss: 73Mb L: 5/10 MS: 1 CMP- DE: "\000\000"- 00:08:08.323 [2024-11-18 14:21:04.288466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:08.323 [2024-11-18 14:21:04.288494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.323 #24 NEW cov: 12404 ft: 14526 corp: 21/87b lim: 10 exec/s: 24 rss: 73Mb L: 2/10 MS: 1 PersAutoDict- DE: "\000\000"- 00:08:08.323 [2024-11-18 14:21:04.359164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b50a cdw11:00000000 00:08:08.323 [2024-11-18 14:21:04.359192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.323 [2024-11-18 14:21:04.359305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004040 cdw11:00000000 00:08:08.323 [2024-11-18 14:21:04.359325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.323 [2024-11-18 14:21:04.359446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000400a cdw11:00000000 00:08:08.323 [2024-11-18 14:21:04.359463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.323 #25 NEW cov: 12404 ft: 14676 corp: 22/94b lim: 10 exec/s: 25 rss: 73Mb L: 7/10 MS: 1 InsertRepeatedBytes- 00:08:08.323 [2024-11-18 14:21:04.409567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006363 cdw11:00000000 00:08:08.323 [2024-11-18 14:21:04.409594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.323 [2024-11-18 14:21:04.409714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006363 cdw11:00000000 00:08:08.323 [2024-11-18 14:21:04.409729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.323 [2024-11-18 14:21:04.409843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:000063b5 cdw11:00000000 00:08:08.323 [2024-11-18 14:21:04.409861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.323 [2024-11-18 14:21:04.409973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:08.323 [2024-11-18 14:21:04.409990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.583 #26 NEW cov: 12404 ft: 14705 corp: 23/102b lim: 10 exec/s: 26 rss: 73Mb L: 8/10 MS: 1 InsertRepeatedBytes- 00:08:08.583 [2024-11-18 14:21:04.479905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b5bc cdw11:00000000 00:08:08.583 [2024-11-18 14:21:04.479930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.583 [2024-11-18 14:21:04.480046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000bc00 cdw11:00000000 00:08:08.583 [2024-11-18 14:21:04.480064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.583 [2024-11-18 14:21:04.480184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 00:08:08.583 [2024-11-18 14:21:04.480200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.583 [2024-11-18 14:21:04.480317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000b50a cdw11:00000000 00:08:08.583 [2024-11-18 14:21:04.480333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.583 [2024-11-18 14:21:04.480448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000690a cdw11:00000000 00:08:08.583 [2024-11-18 14:21:04.480465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:08.583 #27 NEW cov: 12404 ft: 14717 corp: 24/112b lim: 10 exec/s: 27 rss: 73Mb L: 10/10 MS: 1 PersAutoDict- DE: "\000\000"- 00:08:08.583 [2024-11-18 14:21:04.549459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b50a cdw11:00000000 00:08:08.583 [2024-11-18 14:21:04.549487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.583 [2024-11-18 14:21:04.549611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:08.583 [2024-11-18 14:21:04.549629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.583 #28 NEW cov: 12404 ft: 14738 corp: 25/117b lim: 10 exec/s: 28 rss: 73Mb L: 5/10 MS: 1 CrossOver- 00:08:08.583 [2024-11-18 14:21:04.599430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:08.583 [2024-11-18 14:21:04.599456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.583 #29 NEW cov: 12404 ft: 14767 corp: 26/119b lim: 10 exec/s: 29 rss: 73Mb L: 2/10 MS: 1 EraseBytes- 00:08:08.583 [2024-11-18 14:21:04.650267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000021b5 cdw11:00000000 00:08:08.583 [2024-11-18 14:21:04.650293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.583 [2024-11-18 14:21:04.650411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000ab5 cdw11:00000000 00:08:08.583 [2024-11-18 14:21:04.650428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.584 [2024-11-18 14:21:04.650556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000bc0a cdw11:00000000 00:08:08.584 [2024-11-18 14:21:04.650573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.584 [2024-11-18 14:21:04.650692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000bcbc cdw11:00000000 00:08:08.584 [2024-11-18 14:21:04.650708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.584 #30 NEW cov: 12404 ft: 14822 corp: 27/128b lim: 10 exec/s: 30 rss: 73Mb L: 9/10 MS: 1 CrossOver- 00:08:08.844 [2024-11-18 14:21:04.719848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000233a cdw11:00000000 00:08:08.844 [2024-11-18 14:21:04.719876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.844 #31 NEW cov: 12404 ft: 14829 corp: 28/130b lim: 10 exec/s: 31 rss: 73Mb L: 2/10 MS: 1 ChangeByte- 00:08:08.844 [2024-11-18 14:21:04.770722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a40 cdw11:00000000 00:08:08.844 [2024-11-18 14:21:04.770751] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.844 [2024-11-18 14:21:04.770874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00003613 cdw11:00000000 00:08:08.844 [2024-11-18 14:21:04.770893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.844 [2024-11-18 14:21:04.771017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007ba9 cdw11:00000000 00:08:08.844 [2024-11-18 14:21:04.771036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.844 [2024-11-18 14:21:04.771155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00008b8b cdw11:00000000 00:08:08.844 [2024-11-18 14:21:04.771173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.844 #32 NEW cov: 12404 ft: 14908 corp: 29/139b lim: 10 exec/s: 32 rss: 73Mb L: 9/10 MS: 1 CMP- DE: "@6\023{\251\213\213\000"- 00:08:08.844 [2024-11-18 14:21:04.820390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000021b5 cdw11:00000000 00:08:08.844 [2024-11-18 14:21:04.820418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.844 [2024-11-18 14:21:04.820537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004a0a cdw11:00000000 00:08:08.844 [2024-11-18 14:21:04.820562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.844 #33 NEW cov: 12404 ft: 14912 corp: 30/143b lim: 10 exec/s: 16 rss: 73Mb L: 4/10 MS: 1 ChangeBit- 00:08:08.844 #33 DONE cov: 12404 ft: 14912 corp: 30/143b lim: 10 exec/s: 16 rss: 73Mb 00:08:08.844 ###### Recommended dictionary. ###### 00:08:08.844 "\000\000" # Uses: 2 00:08:08.844 "@6\023{\251\213\213\000" # Uses: 0 00:08:08.844 ###### End of recommended dictionary. 
###### 00:08:08.844 Done 33 runs in 2 second(s) 00:08:08.844 14:21:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_6.conf /var/tmp/suppress_nvmf_fuzz 00:08:08.844 14:21:04 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:08.844 14:21:04 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:08.844 14:21:04 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:08:08.844 14:21:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:08:08.844 14:21:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:08.844 14:21:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:08.844 14:21:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:08.845 14:21:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:08:08.845 14:21:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:08.845 14:21:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:08.845 14:21:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 7 00:08:08.845 14:21:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4407 00:08:08.845 14:21:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:08.845 14:21:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:08:08.845 14:21:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:08.845 14:21:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:08.845 14:21:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:08.845 14:21:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 00:08:09.105 [2024-11-18 14:21:04.991584] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:08:09.105 [2024-11-18 14:21:04.991648] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid325460 ] 00:08:09.105 [2024-11-18 14:21:05.198709] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:09.105 [2024-11-18 14:21:05.211423] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:09.365 [2024-11-18 14:21:05.264058] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:09.365 [2024-11-18 14:21:05.280385] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:08:09.365 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:09.365 INFO: Seed: 3650769451 00:08:09.365 INFO: Loaded 1 modules (387495 inline 8-bit counters): 387495 [0x2acbe4c, 0x2b2a7f3), 00:08:09.365 INFO: Loaded 1 PC tables (387495 PCs): 387495 [0x2b2a7f8,0x3114268), 00:08:09.365 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:09.365 INFO: A corpus is not provided, starting from an empty corpus 00:08:09.365 #2 INITED exec/s: 0 rss: 63Mb 00:08:09.365 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:09.365 This may also happen if the target rejected all inputs we tried so far 00:08:09.365 [2024-11-18 14:21:05.347444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007272 cdw11:00000000 00:08:09.365 [2024-11-18 14:21:05.347480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.365 [2024-11-18 14:21:05.347614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00007272 cdw11:00000000 00:08:09.365 [2024-11-18 14:21:05.347634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.365 [2024-11-18 14:21:05.347759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00007272 cdw11:00000000 00:08:09.365 [2024-11-18 14:21:05.347778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.365 [2024-11-18 14:21:05.347895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000728f cdw11:00000000 00:08:09.365 [2024-11-18 14:21:05.347912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.625 NEW_FUNC[1/713]: 0x464b88 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:08:09.625 NEW_FUNC[2/713]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:09.625 #5 NEW cov: 12172 ft: 12173 corp: 2/9b lim: 10 exec/s: 0 rss: 71Mb L: 8/8 MS: 3 ChangeByte-ChangeByte-InsertRepeatedBytes- 00:08:09.625 [2024-11-18 14:21:05.697972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000cafc cdw11:00000000 00:08:09.625 [2024-11-18 14:21:05.698022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.625 [2024-11-18 14:21:05.698154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000fcfc cdw11:00000000 00:08:09.625 [2024-11-18 14:21:05.698178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.625 NEW_FUNC[1/1]: 0x17f89b8 in nvme_ctrlr_get_ready_timeout /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ctrlr.c:1292 00:08:09.625 #10 NEW cov: 12290 ft: 13138 corp: 3/13b lim: 10 exec/s: 0 rss: 71Mb L: 4/8 MS: 5 ShuffleBytes-ChangeByte-ChangeBit-ShuffleBytes-InsertRepeatedBytes- 00:08:09.885 [2024-11-18 14:21:05.758432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 
cid:4 nsid:0 cdw10:00007272 cdw11:00000000 00:08:09.885 [2024-11-18 14:21:05.758465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.885 [2024-11-18 14:21:05.758594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00007272 cdw11:00000000 00:08:09.885 [2024-11-18 14:21:05.758613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.885 [2024-11-18 14:21:05.758721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00007276 cdw11:00000000 00:08:09.885 [2024-11-18 14:21:05.758740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.885 [2024-11-18 14:21:05.758853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000728f cdw11:00000000 00:08:09.885 [2024-11-18 14:21:05.758868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.885 #11 NEW cov: 12296 ft: 13470 corp: 4/21b lim: 10 exec/s: 0 rss: 71Mb L: 8/8 MS: 1 ChangeBit- 00:08:09.885 [2024-11-18 14:21:05.828484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000cafc cdw11:00000000 00:08:09.885 [2024-11-18 14:21:05.828511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.885 [2024-11-18 14:21:05.828644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:09.885 [2024-11-18 14:21:05.828663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.886 [2024-11-18 14:21:05.828776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:09.886 [2024-11-18 14:21:05.828792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.886 [2024-11-18 14:21:05.828900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000fcfc cdw11:00000000 00:08:09.886 [2024-11-18 14:21:05.828917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.886 #12 NEW cov: 12381 ft: 13714 corp: 5/29b lim: 10 exec/s: 0 rss: 71Mb L: 8/8 MS: 1 InsertRepeatedBytes- 00:08:09.886 [2024-11-18 14:21:05.898340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003172 cdw11:00000000 00:08:09.886 [2024-11-18 14:21:05.898369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.886 [2024-11-18 14:21:05.898485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00007272 cdw11:00000000 00:08:09.886 [2024-11-18 14:21:05.898502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.886 #16 NEW cov: 12381 ft: 13842 corp: 6/33b lim: 10 exec/s: 0 rss: 71Mb L: 4/8 MS: 4 
CopyPart-ChangeByte-ShuffleBytes-CrossOver- 00:08:09.886 [2024-11-18 14:21:05.948895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000cafc cdw11:00000000 00:08:09.886 [2024-11-18 14:21:05.948922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.886 [2024-11-18 14:21:05.949040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000fffc cdw11:00000000 00:08:09.886 [2024-11-18 14:21:05.949055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.886 [2024-11-18 14:21:05.949166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:09.886 [2024-11-18 14:21:05.949183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.886 [2024-11-18 14:21:05.949287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000fcfc cdw11:00000000 00:08:09.886 [2024-11-18 14:21:05.949304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.886 #17 NEW cov: 12381 ft: 13938 corp: 7/41b lim: 10 exec/s: 0 rss: 71Mb L: 8/8 MS: 1 ChangeBinInt- 00:08:10.146 [2024-11-18 14:21:06.019090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:10.146 [2024-11-18 14:21:06.019120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.146 [2024-11-18 14:21:06.019242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:10.146 [2024-11-18 14:21:06.019259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.146 [2024-11-18 14:21:06.019371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:10.146 [2024-11-18 14:21:06.019387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.146 [2024-11-18 14:21:06.019503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:10.146 [2024-11-18 14:21:06.019518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.146 #18 NEW cov: 12381 ft: 13995 corp: 8/50b lim: 10 exec/s: 0 rss: 71Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:08:10.146 [2024-11-18 14:21:06.068947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007272 cdw11:00000000 00:08:10.146 [2024-11-18 14:21:06.068972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.146 [2024-11-18 14:21:06.069097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00007272 cdw11:00000000 00:08:10.146 [2024-11-18 14:21:06.069113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.146 [2024-11-18 14:21:06.069228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000728f cdw11:00000000 00:08:10.146 [2024-11-18 14:21:06.069245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.146 #19 NEW cov: 12381 ft: 14167 corp: 9/56b lim: 10 exec/s: 0 rss: 71Mb L: 6/9 MS: 1 EraseBytes- 00:08:10.146 [2024-11-18 14:21:06.119400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007272 cdw11:00000000 00:08:10.146 [2024-11-18 14:21:06.119427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.146 [2024-11-18 14:21:06.119541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00007272 cdw11:00000000 00:08:10.146 [2024-11-18 14:21:06.119563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.146 [2024-11-18 14:21:06.119668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00007276 cdw11:00000000 00:08:10.146 [2024-11-18 14:21:06.119683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.146 [2024-11-18 14:21:06.119805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000728f cdw11:00000000 00:08:10.146 [2024-11-18 14:21:06.119822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.146 #20 NEW cov: 12381 ft: 14201 corp: 10/64b lim: 10 exec/s: 0 rss: 72Mb L: 8/9 MS: 1 ShuffleBytes- 00:08:10.146 [2024-11-18 14:21:06.189294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000311e cdw11:00000000 00:08:10.146 [2024-11-18 14:21:06.189320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.146 [2024-11-18 14:21:06.189444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000072 cdw11:00000000 00:08:10.146 [2024-11-18 14:21:06.189461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.146 [2024-11-18 14:21:06.189584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00007272 cdw11:00000000 00:08:10.146 [2024-11-18 14:21:06.189600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.146 NEW_FUNC[1/1]: 0x1c52b68 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:10.146 #21 NEW cov: 12404 ft: 14304 corp: 11/70b lim: 10 exec/s: 0 rss: 72Mb L: 6/9 MS: 1 CMP- DE: "\036\000"- 00:08:10.146 [2024-11-18 14:21:06.259722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000311e cdw11:00000000 00:08:10.146 [2024-11-18 14:21:06.259748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.146 [2024-11-18 14:21:06.259869] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000072 cdw11:00000000 00:08:10.146 [2024-11-18 14:21:06.259885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.146 [2024-11-18 14:21:06.260001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00007272 cdw11:00000000 00:08:10.146 [2024-11-18 14:21:06.260016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.146 [2024-11-18 14:21:06.260131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00007272 cdw11:00000000 00:08:10.146 [2024-11-18 14:21:06.260146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.406 #22 NEW cov: 12404 ft: 14326 corp: 12/79b lim: 10 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 CrossOver- 00:08:10.406 [2024-11-18 14:21:06.329950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000fc72 cdw11:00000000 00:08:10.406 [2024-11-18 14:21:06.329976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.406 [2024-11-18 14:21:06.330089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00007272 cdw11:00000000 00:08:10.406 [2024-11-18 14:21:06.330104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.406 [2024-11-18 14:21:06.330221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00007272 cdw11:00000000 00:08:10.406 [2024-11-18 14:21:06.330237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.406 [2024-11-18 14:21:06.330359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00007672 cdw11:00000000 00:08:10.406 [2024-11-18 14:21:06.330378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.406 #23 NEW cov: 12404 ft: 14373 corp: 13/88b lim: 10 exec/s: 23 rss: 72Mb L: 9/9 MS: 1 CrossOver- 00:08:10.406 [2024-11-18 14:21:06.379558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000fcfc cdw11:00000000 00:08:10.406 [2024-11-18 14:21:06.379586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.406 #24 NEW cov: 12404 ft: 14577 corp: 14/90b lim: 10 exec/s: 24 rss: 72Mb L: 2/9 MS: 1 EraseBytes- 00:08:10.406 [2024-11-18 14:21:06.430250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:10.406 [2024-11-18 14:21:06.430278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.406 [2024-11-18 14:21:06.430398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:10.407 [2024-11-18 14:21:06.430418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.407 [2024-11-18 14:21:06.430542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffd8 cdw11:00000000 00:08:10.407 [2024-11-18 14:21:06.430564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.407 [2024-11-18 14:21:06.430677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:10.407 [2024-11-18 14:21:06.430693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.407 #25 NEW cov: 12404 ft: 14602 corp: 15/99b lim: 10 exec/s: 25 rss: 72Mb L: 9/9 MS: 1 ChangeByte- 00:08:10.407 [2024-11-18 14:21:06.500500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000cafc cdw11:00000000 00:08:10.407 [2024-11-18 14:21:06.500528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.407 [2024-11-18 14:21:06.500647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000fffc cdw11:00000000 00:08:10.407 [2024-11-18 14:21:06.500664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.407 [2024-11-18 14:21:06.500780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:10.407 [2024-11-18 14:21:06.500797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.407 [2024-11-18 14:21:06.500908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000fffc cdw11:00000000 00:08:10.407 [2024-11-18 14:21:06.500924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.667 #26 NEW cov: 12404 ft: 14632 corp: 16/108b lim: 10 exec/s: 26 rss: 72Mb L: 9/9 MS: 1 InsertByte- 00:08:10.667 [2024-11-18 14:21:06.570611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000cafc cdw11:00000000 00:08:10.667 [2024-11-18 14:21:06.570638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.667 [2024-11-18 14:21:06.570754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000fffc cdw11:00000000 00:08:10.667 [2024-11-18 14:21:06.570771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.667 [2024-11-18 14:21:06.570881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:000072ff cdw11:00000000 00:08:10.667 [2024-11-18 14:21:06.570898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.667 [2024-11-18 14:21:06.571015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000fffc cdw11:00000000 00:08:10.667 [2024-11-18 14:21:06.571031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.667 #27 NEW cov: 12404 ft: 14645 corp: 17/117b lim: 10 exec/s: 27 rss: 72Mb L: 9/9 MS: 1 CrossOver- 00:08:10.667 [2024-11-18 14:21:06.620625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ca1e cdw11:00000000 00:08:10.667 [2024-11-18 14:21:06.620652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.667 [2024-11-18 14:21:06.620766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000000fc cdw11:00000000 00:08:10.667 [2024-11-18 14:21:06.620783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.667 [2024-11-18 14:21:06.620906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000fcfc cdw11:00000000 00:08:10.667 [2024-11-18 14:21:06.620923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.667 #28 NEW cov: 12404 ft: 14651 corp: 18/123b lim: 10 exec/s: 28 rss: 72Mb L: 6/9 MS: 1 PersAutoDict- DE: "\036\000"- 00:08:10.667 [2024-11-18 14:21:06.670796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007272 cdw11:00000000 00:08:10.667 [2024-11-18 14:21:06.670824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.667 [2024-11-18 14:21:06.670936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000fcfc cdw11:00000000 00:08:10.667 [2024-11-18 14:21:06.670955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.667 [2024-11-18 14:21:06.671058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000fc8f cdw11:00000000 00:08:10.667 [2024-11-18 14:21:06.671074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.667 #29 NEW cov: 12404 ft: 14670 corp: 19/129b lim: 10 exec/s: 29 rss: 72Mb L: 6/9 MS: 1 CrossOver- 00:08:10.667 [2024-11-18 14:21:06.741135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000311e cdw11:00000000 00:08:10.667 [2024-11-18 14:21:06.741163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.667 [2024-11-18 14:21:06.741278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000072 cdw11:00000000 00:08:10.667 [2024-11-18 14:21:06.741296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.667 [2024-11-18 14:21:06.741418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00008a8d cdw11:00000000 00:08:10.667 [2024-11-18 14:21:06.741433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.667 #30 NEW cov: 12404 ft: 14738 corp: 20/135b lim: 10 exec/s: 30 rss: 72Mb L: 6/9 MS: 1 ChangeBinInt- 00:08:10.667 [2024-11-18 14:21:06.790741] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000728a cdw11:00000000 00:08:10.667 [2024-11-18 14:21:06.790768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.928 #31 NEW cov: 12404 ft: 14799 corp: 21/138b lim: 10 exec/s: 31 rss: 72Mb L: 3/9 MS: 1 EraseBytes- 00:08:10.928 [2024-11-18 14:21:06.861597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000cafc cdw11:00000000 00:08:10.928 [2024-11-18 14:21:06.861624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.928 [2024-11-18 14:21:06.861753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000fffc cdw11:00000000 00:08:10.928 [2024-11-18 14:21:06.861770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.928 [2024-11-18 14:21:06.861884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00007200 cdw11:00000000 00:08:10.928 [2024-11-18 14:21:06.861899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.928 [2024-11-18 14:21:06.862029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:000000fc cdw11:00000000 00:08:10.928 [2024-11-18 14:21:06.862046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.928 #32 NEW cov: 12404 ft: 14808 corp: 22/147b lim: 10 exec/s: 32 rss: 72Mb L: 9/9 MS: 1 CMP- DE: "\000\000"- 00:08:10.928 [2024-11-18 14:21:06.931874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007272 cdw11:00000000 00:08:10.928 [2024-11-18 14:21:06.931902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.928 [2024-11-18 14:21:06.932027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00007272 cdw11:00000000 00:08:10.928 [2024-11-18 14:21:06.932045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.928 [2024-11-18 14:21:06.932156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00007272 cdw11:00000000 00:08:10.928 [2024-11-18 14:21:06.932173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.928 [2024-11-18 14:21:06.932287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000728f cdw11:00000000 00:08:10.928 [2024-11-18 14:21:06.932304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.928 #33 NEW cov: 12404 ft: 14830 corp: 23/155b lim: 10 exec/s: 33 rss: 72Mb L: 8/9 MS: 1 CopyPart- 00:08:10.928 [2024-11-18 14:21:06.981955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003737 cdw11:00000000 00:08:10.928 [2024-11-18 14:21:06.981984] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.928 [2024-11-18 14:21:06.982100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000037ca cdw11:00000000 00:08:10.928 [2024-11-18 14:21:06.982117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.928 [2024-11-18 14:21:06.982236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00001e00 cdw11:00000000 00:08:10.928 [2024-11-18 14:21:06.982253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.928 [2024-11-18 14:21:06.982372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000fcfc cdw11:00000000 00:08:10.928 [2024-11-18 14:21:06.982389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.928 #34 NEW cov: 12404 ft: 14846 corp: 24/164b lim: 10 exec/s: 34 rss: 72Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:08:10.928 [2024-11-18 14:21:07.052375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000cafc cdw11:00000000 00:08:10.928 [2024-11-18 14:21:07.052404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.928 [2024-11-18 14:21:07.052523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000fffc cdw11:00000000 00:08:10.928 [2024-11-18 14:21:07.052538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.928 [2024-11-18 14:21:07.052660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:000072ff cdw11:00000000 00:08:10.928 [2024-11-18 14:21:07.052678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.928 [2024-11-18 14:21:07.052788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000fffc cdw11:00000000 00:08:10.928 [2024-11-18 14:21:07.052805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.928 [2024-11-18 14:21:07.052925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000fcfc cdw11:00000000 00:08:10.928 [2024-11-18 14:21:07.052943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.188 #35 NEW cov: 12404 ft: 14921 corp: 25/174b lim: 10 exec/s: 35 rss: 72Mb L: 10/10 MS: 1 CopyPart- 00:08:11.188 [2024-11-18 14:21:07.101691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000738a cdw11:00000000 00:08:11.188 [2024-11-18 14:21:07.101719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.188 #36 NEW cov: 12404 ft: 14936 corp: 26/177b lim: 10 exec/s: 36 rss: 73Mb L: 3/10 MS: 1 ChangeBit- 00:08:11.188 [2024-11-18 14:21:07.172515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 
cdw10:00007272 cdw11:00000000 00:08:11.188 [2024-11-18 14:21:07.172541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.188 [2024-11-18 14:21:07.172664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00007272 cdw11:00000000 00:08:11.188 [2024-11-18 14:21:07.172682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.188 [2024-11-18 14:21:07.172799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00007272 cdw11:00000000 00:08:11.188 [2024-11-18 14:21:07.172817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.188 [2024-11-18 14:21:07.172937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000c98f cdw11:00000000 00:08:11.188 [2024-11-18 14:21:07.172955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.188 #37 NEW cov: 12404 ft: 14948 corp: 27/185b lim: 10 exec/s: 37 rss: 73Mb L: 8/10 MS: 1 ChangeByte- 00:08:11.188 [2024-11-18 14:21:07.242781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007272 cdw11:00000000 00:08:11.188 [2024-11-18 14:21:07.242810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.188 [2024-11-18 14:21:07.242933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00007272 cdw11:00000000 00:08:11.188 [2024-11-18 14:21:07.242951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.188 [2024-11-18 14:21:07.243065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00008f72 cdw11:00000000 00:08:11.188 [2024-11-18 14:21:07.243082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.188 [2024-11-18 14:21:07.243203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00007272 cdw11:00000000 00:08:11.188 [2024-11-18 14:21:07.243218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.188 #38 NEW cov: 12404 ft: 14968 corp: 28/194b lim: 10 exec/s: 38 rss: 73Mb L: 9/10 MS: 1 CopyPart- 00:08:11.188 [2024-11-18 14:21:07.292957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000cafc cdw11:00000000 00:08:11.188 [2024-11-18 14:21:07.292985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.188 [2024-11-18 14:21:07.293097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:00000000 00:08:11.188 [2024-11-18 14:21:07.293113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.188 [2024-11-18 14:21:07.293242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 
cdw10:0000fc00 cdw11:00000000 00:08:11.189 [2024-11-18 14:21:07.293261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.189 [2024-11-18 14:21:07.293376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000fc72 cdw11:00000000 00:08:11.189 [2024-11-18 14:21:07.293392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.449 #39 NEW cov: 12404 ft: 14982 corp: 29/203b lim: 10 exec/s: 19 rss: 73Mb L: 9/10 MS: 1 ShuffleBytes- 00:08:11.449 #39 DONE cov: 12404 ft: 14982 corp: 29/203b lim: 10 exec/s: 19 rss: 73Mb 00:08:11.449 ###### Recommended dictionary. ###### 00:08:11.449 "\036\000" # Uses: 1 00:08:11.449 "\000\000" # Uses: 0 00:08:11.449 ###### End of recommended dictionary. ###### 00:08:11.449 Done 39 runs in 2 second(s) 00:08:11.449 14:21:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_7.conf /var/tmp/suppress_nvmf_fuzz 00:08:11.449 14:21:07 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:11.449 14:21:07 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:11.449 14:21:07 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:08:11.449 14:21:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:08:11.449 14:21:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:11.449 14:21:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:11.449 14:21:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:11.449 14:21:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:08:11.449 14:21:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:11.449 14:21:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:11.449 14:21:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 8 00:08:11.449 14:21:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4408 00:08:11.449 14:21:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:11.449 14:21:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:08:11.449 14:21:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:11.449 14:21:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:11.449 14:21:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:11.449 14:21:07 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 00:08:11.449 [2024-11-18 14:21:07.476870] Starting SPDK 
v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:08:11.449 [2024-11-18 14:21:07.476946] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid326121 ] 00:08:11.709 [2024-11-18 14:21:07.675440] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:11.709 [2024-11-18 14:21:07.687767] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:11.709 [2024-11-18 14:21:07.740088] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:11.709 [2024-11-18 14:21:07.756389] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:08:11.709 INFO: Running with entropic power schedule (0xFF, 100). 00:08:11.709 INFO: Seed: 1832802798 00:08:11.709 INFO: Loaded 1 modules (387495 inline 8-bit counters): 387495 [0x2acbe4c, 0x2b2a7f3), 00:08:11.709 INFO: Loaded 1 PC tables (387495 PCs): 387495 [0x2b2a7f8,0x3114268), 00:08:11.709 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:11.709 INFO: A corpus is not provided, starting from an empty corpus 00:08:11.709 [2024-11-18 14:21:07.811873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.709 [2024-11-18 14:21:07.811908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.709 #2 INITED cov: 12205 ft: 12204 corp: 1/1b exec/s: 0 rss: 71Mb 00:08:11.969 [2024-11-18 14:21:07.851796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.969 [2024-11-18 14:21:07.851821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.969 #3 NEW cov: 12318 ft: 12857 corp: 2/2b lim: 5 exec/s: 0 rss: 71Mb L: 1/1 MS: 1 ChangeByte- 00:08:11.969 [2024-11-18 14:21:07.911985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.969 [2024-11-18 14:21:07.912009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.969 #4 NEW cov: 12324 ft: 13124 corp: 3/3b lim: 5 exec/s: 0 rss: 71Mb L: 1/1 MS: 1 ChangeByte- 00:08:11.969 [2024-11-18 14:21:07.972138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.969 [2024-11-18 14:21:07.972162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.969 #5 NEW cov: 12409 ft: 13409 corp: 4/4b lim: 5 exec/s: 0 rss: 71Mb L: 1/1 MS: 1 CrossOver- 00:08:11.969 [2024-11-18 14:21:08.012249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.969 [2024-11-18 14:21:08.012274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:11.969 #6 NEW cov: 12409 ft: 13483 corp: 5/5b lim: 5 exec/s: 0 rss: 71Mb L: 1/1 MS: 1 ChangeBit- 00:08:11.969 [2024-11-18 14:21:08.072407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.969 [2024-11-18 14:21:08.072432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.229 #7 NEW cov: 12409 ft: 13549 corp: 6/6b lim: 5 exec/s: 0 rss: 72Mb L: 1/1 MS: 1 ChangeByte- 00:08:12.229 [2024-11-18 14:21:08.132602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.229 [2024-11-18 14:21:08.132626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.229 #8 NEW cov: 12409 ft: 13626 corp: 7/7b lim: 5 exec/s: 0 rss: 72Mb L: 1/1 MS: 1 ChangeBit- 00:08:12.229 [2024-11-18 14:21:08.172709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.229 [2024-11-18 14:21:08.172733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.229 #9 NEW cov: 12409 ft: 13661 corp: 8/8b lim: 5 exec/s: 0 rss: 72Mb L: 1/1 MS: 1 CopyPart- 00:08:12.229 [2024-11-18 14:21:08.213272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.229 [2024-11-18 14:21:08.213299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.229 [2024-11-18 14:21:08.213370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.229 [2024-11-18 14:21:08.213384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.229 [2024-11-18 14:21:08.213436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.229 [2024-11-18 14:21:08.213450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.229 [2024-11-18 14:21:08.213503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.229 [2024-11-18 14:21:08.213517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.229 #10 NEW cov: 12409 ft: 14477 corp: 9/12b lim: 5 exec/s: 0 rss: 72Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:08:12.229 [2024-11-18 14:21:08.272981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.229 [2024-11-18 14:21:08.273006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:12.230 #11 NEW cov: 12409 ft: 14508 corp: 10/13b lim: 5 exec/s: 0 rss: 72Mb L: 1/4 MS: 1 ChangeBit- 00:08:12.230 [2024-11-18 14:21:08.313263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.230 [2024-11-18 14:21:08.313288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.230 [2024-11-18 14:21:08.313342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.230 [2024-11-18 14:21:08.313356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.230 #12 NEW cov: 12409 ft: 14713 corp: 11/15b lim: 5 exec/s: 0 rss: 72Mb L: 2/4 MS: 1 InsertByte- 00:08:12.490 [2024-11-18 14:21:08.373263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.490 [2024-11-18 14:21:08.373289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.490 #13 NEW cov: 12409 ft: 14837 corp: 12/16b lim: 5 exec/s: 0 rss: 72Mb L: 1/4 MS: 1 ShuffleBytes- 00:08:12.490 [2024-11-18 14:21:08.433813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.490 [2024-11-18 14:21:08.433839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.490 [2024-11-18 14:21:08.433893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.490 [2024-11-18 14:21:08.433907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.490 [2024-11-18 14:21:08.433962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.490 [2024-11-18 14:21:08.433975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.490 #14 NEW cov: 12409 ft: 15029 corp: 13/19b lim: 5 exec/s: 0 rss: 72Mb L: 3/4 MS: 1 CopyPart- 00:08:12.490 [2024-11-18 14:21:08.493645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.490 [2024-11-18 14:21:08.493677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.490 #15 NEW cov: 12409 ft: 15038 corp: 14/20b lim: 5 exec/s: 0 rss: 72Mb L: 1/4 MS: 1 CopyPart- 00:08:12.490 [2024-11-18 14:21:08.533745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.490 [2024-11-18 14:21:08.533770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.490 #16 NEW cov: 12409 ft: 15059 corp: 15/21b lim: 5 exec/s: 0 rss: 72Mb L: 1/4 MS: 1 ChangeBit- 00:08:12.490 [2024-11-18 14:21:08.574165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.490 [2024-11-18 14:21:08.574190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.490 [2024-11-18 14:21:08.574243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.490 [2024-11-18 14:21:08.574257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.490 [2024-11-18 14:21:08.574308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.490 [2024-11-18 14:21:08.574321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.490 #17 NEW cov: 12409 ft: 15068 corp: 16/24b lim: 5 exec/s: 0 rss: 72Mb L: 3/4 MS: 1 ChangeByte- 00:08:12.749 [2024-11-18 14:21:08.633984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.749 [2024-11-18 14:21:08.634009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.749 #18 NEW cov: 12409 ft: 15086 corp: 17/25b lim: 5 exec/s: 0 rss: 72Mb L: 1/4 MS: 1 ChangeByte- 00:08:12.749 [2024-11-18 14:21:08.694176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.749 [2024-11-18 14:21:08.694201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.009 NEW_FUNC[1/1]: 0x1c52b68 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:13.010 #19 NEW cov: 12432 ft: 15126 corp: 18/26b lim: 5 exec/s: 19 rss: 74Mb L: 1/4 MS: 1 ShuffleBytes- 00:08:13.010 [2024-11-18 14:21:09.026279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.010 [2024-11-18 14:21:09.026329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.010 #20 NEW cov: 12432 ft: 15371 corp: 19/27b lim: 5 exec/s: 20 rss: 74Mb L: 1/4 MS: 1 ChangeBit- 00:08:13.010 [2024-11-18 14:21:09.076247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.010 [2024-11-18 14:21:09.076275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.010 #21 NEW cov: 12432 ft: 15451 corp: 20/28b lim: 5 exec/s: 21 rss: 74Mb L: 1/4 MS: 1 CopyPart- 00:08:13.010 [2024-11-18 14:21:09.127552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.010 [2024-11-18 14:21:09.127578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.010 [2024-11-18 14:21:09.127712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.010 [2024-11-18 14:21:09.127731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.010 [2024-11-18 14:21:09.127868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.010 [2024-11-18 14:21:09.127886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.010 [2024-11-18 14:21:09.128034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.010 [2024-11-18 14:21:09.128051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.010 [2024-11-18 14:21:09.128186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.010 [2024-11-18 14:21:09.128204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:13.272 #22 NEW cov: 12432 ft: 15508 corp: 21/33b lim: 5 exec/s: 22 rss: 74Mb L: 5/5 MS: 1 CMP- DE: "\001\000\000\000"- 00:08:13.272 [2024-11-18 14:21:09.197839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.272 [2024-11-18 14:21:09.197865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.272 [2024-11-18 14:21:09.198002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.272 [2024-11-18 14:21:09.198020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.272 [2024-11-18 14:21:09.198163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.272 [2024-11-18 14:21:09.198180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.272 [2024-11-18 14:21:09.198319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.272 [2024-11-18 14:21:09.198336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.272 [2024-11-18 14:21:09.198472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) 
qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.272 [2024-11-18 14:21:09.198493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:13.272 #23 NEW cov: 12432 ft: 15526 corp: 22/38b lim: 5 exec/s: 23 rss: 74Mb L: 5/5 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:08:13.272 [2024-11-18 14:21:09.247095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.272 [2024-11-18 14:21:09.247123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.272 [2024-11-18 14:21:09.247277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.272 [2024-11-18 14:21:09.247295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.272 #24 NEW cov: 12432 ft: 15548 corp: 23/40b lim: 5 exec/s: 24 rss: 74Mb L: 2/5 MS: 1 CopyPart- 00:08:13.272 [2024-11-18 14:21:09.297042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.272 [2024-11-18 14:21:09.297070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.272 #25 NEW cov: 12432 ft: 15565 corp: 24/41b lim: 5 exec/s: 25 rss: 74Mb L: 1/5 MS: 1 CopyPart- 00:08:13.272 [2024-11-18 14:21:09.347541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.272 [2024-11-18 14:21:09.347573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.272 [2024-11-18 14:21:09.347725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.272 [2024-11-18 14:21:09.347742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.272 #26 NEW cov: 12432 ft: 15583 corp: 25/43b lim: 5 exec/s: 26 rss: 74Mb L: 2/5 MS: 1 CopyPart- 00:08:13.272 [2024-11-18 14:21:09.398261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.272 [2024-11-18 14:21:09.398289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.272 [2024-11-18 14:21:09.398441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.272 [2024-11-18 14:21:09.398457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.272 [2024-11-18 14:21:09.398596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:13.272 [2024-11-18 14:21:09.398613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.272 [2024-11-18 14:21:09.398757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.272 [2024-11-18 14:21:09.398775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.532 #27 NEW cov: 12432 ft: 15659 corp: 26/47b lim: 5 exec/s: 27 rss: 74Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:08:13.532 [2024-11-18 14:21:09.467633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.532 [2024-11-18 14:21:09.467660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.532 #28 NEW cov: 12432 ft: 15668 corp: 27/48b lim: 5 exec/s: 28 rss: 74Mb L: 1/5 MS: 1 CopyPart- 00:08:13.532 [2024-11-18 14:21:09.518075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.532 [2024-11-18 14:21:09.518105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.532 [2024-11-18 14:21:09.518256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.532 [2024-11-18 14:21:09.518276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.532 #29 NEW cov: 12432 ft: 15751 corp: 28/50b lim: 5 exec/s: 29 rss: 74Mb L: 2/5 MS: 1 ChangeBinInt- 00:08:13.532 [2024-11-18 14:21:09.587937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.532 [2024-11-18 14:21:09.587966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.532 #30 NEW cov: 12432 ft: 15763 corp: 29/51b lim: 5 exec/s: 30 rss: 74Mb L: 1/5 MS: 1 ChangeByte- 00:08:13.532 [2024-11-18 14:21:09.638114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.532 [2024-11-18 14:21:09.638143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.532 #31 NEW cov: 12432 ft: 15778 corp: 30/52b lim: 5 exec/s: 31 rss: 74Mb L: 1/5 MS: 1 ChangeBit- 00:08:13.792 [2024-11-18 14:21:09.688644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.792 [2024-11-18 14:21:09.688671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.792 [2024-11-18 14:21:09.688823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:08:13.792 [2024-11-18 14:21:09.688841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.792 #32 NEW cov: 12432 ft: 15792 corp: 31/54b lim: 5 exec/s: 32 rss: 74Mb L: 2/5 MS: 1 InsertByte- 00:08:13.792 [2024-11-18 14:21:09.738964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.792 [2024-11-18 14:21:09.738992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.792 [2024-11-18 14:21:09.739145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.792 [2024-11-18 14:21:09.739165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.792 #33 NEW cov: 12432 ft: 15813 corp: 32/56b lim: 5 exec/s: 33 rss: 74Mb L: 2/5 MS: 1 InsertByte- 00:08:13.792 [2024-11-18 14:21:09.808713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.792 [2024-11-18 14:21:09.808741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.792 #34 NEW cov: 12432 ft: 15823 corp: 33/57b lim: 5 exec/s: 17 rss: 74Mb L: 1/5 MS: 1 ChangeByte- 00:08:13.793 #34 DONE cov: 12432 ft: 15823 corp: 33/57b lim: 5 exec/s: 17 rss: 74Mb 00:08:13.793 ###### Recommended dictionary. ###### 00:08:13.793 "\001\000\000\000" # Uses: 1 00:08:13.793 ###### End of recommended dictionary. 
###### 00:08:13.793 Done 34 runs in 2 second(s) 00:08:14.053 14:21:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_8.conf /var/tmp/suppress_nvmf_fuzz 00:08:14.053 14:21:09 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:14.053 14:21:09 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:14.053 14:21:09 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:08:14.053 14:21:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:08:14.053 14:21:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:14.053 14:21:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:14.053 14:21:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:14.053 14:21:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:08:14.053 14:21:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:14.053 14:21:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:14.053 14:21:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 9 00:08:14.053 14:21:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4409 00:08:14.053 14:21:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:14.053 14:21:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:08:14.053 14:21:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:14.053 14:21:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:14.053 14:21:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:14.053 14:21:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 00:08:14.053 [2024-11-18 14:21:09.994670] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:08:14.053 [2024-11-18 14:21:09.994752] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid326444 ] 00:08:14.313 [2024-11-18 14:21:10.197811] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:14.313 [2024-11-18 14:21:10.211462] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:14.313 [2024-11-18 14:21:10.264231] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:14.313 [2024-11-18 14:21:10.280570] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:08:14.313 INFO: Running with entropic power schedule (0xFF, 100). 
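The run.sh trace above shows how each fuzzer instance is provisioned: a dedicated TCP port is derived from the fuzzer number (44 followed by the zero-padded index, so fuzzer 8 listens on 4408 and fuzzer 9 on 4409), the JSON target config is rewritten to that port with sed, two LeakSanitizer suppressions are registered, and llvm_nvme_fuzz is launched against the resulting TRID. The following is a minimal shell sketch of that per-fuzzer setup; the function name start_llvm_fuzz_sketch and the SPDK_DIR variable are illustrative stand-ins, the redirection of the sed output into the per-fuzzer config file is inferred (the trace shows the sed command and the config path but not the redirect), and only commands that appear in the trace are assumed.

#!/usr/bin/env bash
# Sketch of the per-fuzzer setup traced above (hypothetical helper name).
# $1 = fuzzer type, $2 = run time in seconds, $3 = core mask.
start_llvm_fuzz_sketch() {
  local fuzzer_type=$1 timen=$2 core=$3
  local corpus_dir=$SPDK_DIR/../corpus/llvm_nvmf_${fuzzer_type}
  local nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
  local suppress_file=/var/tmp/suppress_nvmf_fuzz
  # Each fuzzer gets its own TCP port: "44" plus the zero-padded fuzzer
  # number, e.g. 08 -> 4408, 09 -> 4409 (matches the trace's printf %02d).
  local port=44$(printf %02d "$fuzzer_type")
  mkdir -p "$corpus_dir"
  local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:${port}"
  # Rewrite the template target config so it listens on this fuzzer's port
  # (output redirection into $nvmf_cfg is inferred, not shown in the trace).
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${port}\"/" \
      "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
  # In-target allocations that outlive the run; suppress them for LeakSanitizer.
  echo leak:spdk_nvmf_qpair_disconnect > "$suppress_file"
  echo leak:nvmf_ctrlr_create >> "$suppress_file"
  LSAN_OPTIONS=report_objects=1:suppressions=${suppress_file}:print_suppressions=0 \
    "$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
      -m "$core" -s 512 -P "$SPDK_DIR/../output/llvm/" \
      -F "$trid" -c "$nvmf_cfg" -t "$timen" -D "$corpus_dir" -Z "$fuzzer_type"
  rm -rf "$nvmf_cfg" "$suppress_file"
}

Invoked as start_llvm_fuzz_sketch 9 1 0x1, this corresponds to the fuzzer-9 run whose startup follows below: one second of fuzzing on core 0 against 127.0.0.1:4409.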
00:08:14.313 INFO: Seed: 61852748 00:08:14.313 INFO: Loaded 1 modules (387495 inline 8-bit counters): 387495 [0x2acbe4c, 0x2b2a7f3), 00:08:14.313 INFO: Loaded 1 PC tables (387495 PCs): 387495 [0x2b2a7f8,0x3114268), 00:08:14.313 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:14.313 INFO: A corpus is not provided, starting from an empty corpus 00:08:14.313 [2024-11-18 14:21:10.347044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.313 [2024-11-18 14:21:10.347081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.313 #2 INITED cov: 12199 ft: 12204 corp: 1/1b exec/s: 0 rss: 71Mb 00:08:14.313 [2024-11-18 14:21:10.397479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.313 [2024-11-18 14:21:10.397512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.313 [2024-11-18 14:21:10.397663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.313 [2024-11-18 14:21:10.397681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.573 NEW_FUNC[1/1]: 0x1522d78 in nvmf_tcp_poll_group_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:3555 00:08:14.573 #3 NEW cov: 12318 ft: 13549 corp: 2/3b lim: 5 exec/s: 0 rss: 72Mb L: 2/2 MS: 1 CrossOver- 00:08:14.833 [2024-11-18 14:21:10.727875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.833 [2024-11-18 14:21:10.727911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.833 #4 NEW cov: 12324 ft: 13755 corp: 3/4b lim: 5 exec/s: 0 rss: 72Mb L: 1/2 MS: 1 CopyPart- 00:08:14.833 [2024-11-18 14:21:10.768906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.833 [2024-11-18 14:21:10.768933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.833 [2024-11-18 14:21:10.769059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.833 [2024-11-18 14:21:10.769077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.833 [2024-11-18 14:21:10.769200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.833 [2024-11-18 14:21:10.769216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.833 [2024-11-18 14:21:10.769332] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.833 [2024-11-18 14:21:10.769348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.833 [2024-11-18 14:21:10.769463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.833 [2024-11-18 14:21:10.769480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:14.833 #5 NEW cov: 12409 ft: 14371 corp: 4/9b lim: 5 exec/s: 0 rss: 72Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:08:14.833 [2024-11-18 14:21:10.808038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.833 [2024-11-18 14:21:10.808063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.833 #6 NEW cov: 12409 ft: 14623 corp: 5/10b lim: 5 exec/s: 0 rss: 72Mb L: 1/5 MS: 1 ShuffleBytes- 00:08:14.834 [2024-11-18 14:21:10.859153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.834 [2024-11-18 14:21:10.859179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.834 [2024-11-18 14:21:10.859308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.834 [2024-11-18 14:21:10.859325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.834 [2024-11-18 14:21:10.859436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.834 [2024-11-18 14:21:10.859453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.834 [2024-11-18 14:21:10.859572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.834 [2024-11-18 14:21:10.859602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.834 [2024-11-18 14:21:10.859729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.834 [2024-11-18 14:21:10.859745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:14.834 #7 NEW cov: 12409 ft: 14668 corp: 6/15b lim: 5 exec/s: 0 rss: 72Mb L: 5/5 MS: 1 ChangeBit- 00:08:14.834 [2024-11-18 14:21:10.929297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.834 [2024-11-18 14:21:10.929325] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.834 [2024-11-18 14:21:10.929441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.834 [2024-11-18 14:21:10.929459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.834 [2024-11-18 14:21:10.929579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.834 [2024-11-18 14:21:10.929607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.834 [2024-11-18 14:21:10.929734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.834 [2024-11-18 14:21:10.929750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.834 [2024-11-18 14:21:10.929877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.834 [2024-11-18 14:21:10.929893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:15.094 #8 NEW cov: 12409 ft: 14754 corp: 7/20b lim: 5 exec/s: 0 rss: 73Mb L: 5/5 MS: 1 ChangeBit- 00:08:15.094 [2024-11-18 14:21:10.998495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.094 [2024-11-18 14:21:10.998520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.094 #9 NEW cov: 12409 ft: 14800 corp: 8/21b lim: 5 exec/s: 0 rss: 73Mb L: 1/5 MS: 1 ChangeByte- 00:08:15.094 [2024-11-18 14:21:11.049709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.094 [2024-11-18 14:21:11.049736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.094 [2024-11-18 14:21:11.049861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.094 [2024-11-18 14:21:11.049878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.094 [2024-11-18 14:21:11.050008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.094 [2024-11-18 14:21:11.050025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.094 [2024-11-18 14:21:11.050151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:08:15.094 [2024-11-18 14:21:11.050174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.094 [2024-11-18 14:21:11.050290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.094 [2024-11-18 14:21:11.050307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:15.094 #10 NEW cov: 12409 ft: 14813 corp: 9/26b lim: 5 exec/s: 0 rss: 73Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:08:15.094 [2024-11-18 14:21:11.099593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.094 [2024-11-18 14:21:11.099620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.094 [2024-11-18 14:21:11.099747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.094 [2024-11-18 14:21:11.099763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.094 [2024-11-18 14:21:11.099885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.094 [2024-11-18 14:21:11.099902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.094 [2024-11-18 14:21:11.100032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.094 [2024-11-18 14:21:11.100048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.094 #11 NEW cov: 12409 ft: 14874 corp: 10/30b lim: 5 exec/s: 0 rss: 73Mb L: 4/5 MS: 1 EraseBytes- 00:08:15.094 [2024-11-18 14:21:11.149482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.094 [2024-11-18 14:21:11.149507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.094 [2024-11-18 14:21:11.149625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.094 [2024-11-18 14:21:11.149641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.094 [2024-11-18 14:21:11.149768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.094 [2024-11-18 14:21:11.149784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.094 #12 NEW cov: 12409 ft: 15077 corp: 11/33b lim: 5 exec/s: 0 rss: 73Mb L: 3/5 MS: 1 EraseBytes- 00:08:15.094 
[2024-11-18 14:21:11.219239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.094 [2024-11-18 14:21:11.219266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.354 NEW_FUNC[1/1]: 0x1c52b68 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:15.354 #13 NEW cov: 12432 ft: 15121 corp: 12/34b lim: 5 exec/s: 0 rss: 73Mb L: 1/5 MS: 1 ChangeBit- 00:08:15.354 [2024-11-18 14:21:11.269646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.354 [2024-11-18 14:21:11.269677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.354 [2024-11-18 14:21:11.269790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.354 [2024-11-18 14:21:11.269806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.354 #14 NEW cov: 12432 ft: 15174 corp: 13/36b lim: 5 exec/s: 0 rss: 73Mb L: 2/5 MS: 1 CopyPart- 00:08:15.354 [2024-11-18 14:21:11.340728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.354 [2024-11-18 14:21:11.340755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.354 [2024-11-18 14:21:11.340868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.354 [2024-11-18 14:21:11.340885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.354 [2024-11-18 14:21:11.340995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.354 [2024-11-18 14:21:11.341011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.354 [2024-11-18 14:21:11.341127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.354 [2024-11-18 14:21:11.341142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.354 [2024-11-18 14:21:11.341259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.355 [2024-11-18 14:21:11.341276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:15.355 #15 NEW cov: 12432 ft: 15255 corp: 14/41b lim: 5 exec/s: 15 rss: 73Mb L: 5/5 MS: 1 CopyPart- 00:08:15.355 [2024-11-18 14:21:11.390039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.355 [2024-11-18 14:21:11.390065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.355 [2024-11-18 14:21:11.390183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.355 [2024-11-18 14:21:11.390200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.355 #16 NEW cov: 12432 ft: 15288 corp: 15/43b lim: 5 exec/s: 16 rss: 73Mb L: 2/5 MS: 1 CopyPart- 00:08:15.355 [2024-11-18 14:21:11.440777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.355 [2024-11-18 14:21:11.440804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.355 [2024-11-18 14:21:11.440935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.355 [2024-11-18 14:21:11.440954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.355 [2024-11-18 14:21:11.441068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.355 [2024-11-18 14:21:11.441089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.355 [2024-11-18 14:21:11.441203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.355 [2024-11-18 14:21:11.441219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.615 #17 NEW cov: 12432 ft: 15332 corp: 16/47b lim: 5 exec/s: 17 rss: 73Mb L: 4/5 MS: 1 EraseBytes- 00:08:15.615 [2024-11-18 14:21:11.510137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.615 [2024-11-18 14:21:11.510167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.615 #18 NEW cov: 12432 ft: 15345 corp: 17/48b lim: 5 exec/s: 18 rss: 73Mb L: 1/5 MS: 1 ChangeByte- 00:08:15.615 [2024-11-18 14:21:11.580396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.615 [2024-11-18 14:21:11.580428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.615 #19 NEW cov: 12432 ft: 15378 corp: 18/49b lim: 5 exec/s: 19 rss: 73Mb L: 1/5 MS: 1 CrossOver- 00:08:15.615 [2024-11-18 14:21:11.631095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:15.615 [2024-11-18 14:21:11.631124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.615 [2024-11-18 14:21:11.631248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.615 [2024-11-18 14:21:11.631266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.615 [2024-11-18 14:21:11.631387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.615 [2024-11-18 14:21:11.631403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.615 #20 NEW cov: 12432 ft: 15405 corp: 19/52b lim: 5 exec/s: 20 rss: 73Mb L: 3/5 MS: 1 CrossOver- 00:08:15.615 [2024-11-18 14:21:11.701316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.615 [2024-11-18 14:21:11.701344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.615 [2024-11-18 14:21:11.701468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.615 [2024-11-18 14:21:11.701485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.615 [2024-11-18 14:21:11.701610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.615 [2024-11-18 14:21:11.701626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.875 #21 NEW cov: 12432 ft: 15411 corp: 20/55b lim: 5 exec/s: 21 rss: 73Mb L: 3/5 MS: 1 ChangeBinInt- 00:08:15.875 [2024-11-18 14:21:11.771713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.875 [2024-11-18 14:21:11.771744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.875 [2024-11-18 14:21:11.771876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.876 [2024-11-18 14:21:11.771894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.876 [2024-11-18 14:21:11.772010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.876 [2024-11-18 14:21:11.772029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.876 [2024-11-18 14:21:11.772147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) 
qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.876 [2024-11-18 14:21:11.772164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.876 #22 NEW cov: 12432 ft: 15450 corp: 21/59b lim: 5 exec/s: 22 rss: 73Mb L: 4/5 MS: 1 CrossOver- 00:08:15.876 [2024-11-18 14:21:11.821368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.876 [2024-11-18 14:21:11.821396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.876 [2024-11-18 14:21:11.821521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.876 [2024-11-18 14:21:11.821538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.876 #23 NEW cov: 12432 ft: 15484 corp: 22/61b lim: 5 exec/s: 23 rss: 73Mb L: 2/5 MS: 1 ChangeBit- 00:08:15.876 [2024-11-18 14:21:11.881751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.876 [2024-11-18 14:21:11.881778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.876 [2024-11-18 14:21:11.881903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.876 [2024-11-18 14:21:11.881919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.876 [2024-11-18 14:21:11.882046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.876 [2024-11-18 14:21:11.882061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.876 #24 NEW cov: 12432 ft: 15492 corp: 23/64b lim: 5 exec/s: 24 rss: 73Mb L: 3/5 MS: 1 EraseBytes- 00:08:15.876 [2024-11-18 14:21:11.931642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.876 [2024-11-18 14:21:11.931670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.876 [2024-11-18 14:21:11.931800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.876 [2024-11-18 14:21:11.931819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.876 #25 NEW cov: 12432 ft: 15502 corp: 24/66b lim: 5 exec/s: 25 rss: 73Mb L: 2/5 MS: 1 ChangeBit- 00:08:15.876 [2024-11-18 14:21:11.972608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.876 
[2024-11-18 14:21:11.972634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.876 [2024-11-18 14:21:11.972750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.876 [2024-11-18 14:21:11.972767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.876 [2024-11-18 14:21:11.972891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.876 [2024-11-18 14:21:11.972909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.876 [2024-11-18 14:21:11.973027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.876 [2024-11-18 14:21:11.973043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.876 [2024-11-18 14:21:11.973158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.876 [2024-11-18 14:21:11.973173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:15.876 #26 NEW cov: 12432 ft: 15539 corp: 25/71b lim: 5 exec/s: 26 rss: 73Mb L: 5/5 MS: 1 ChangeBinInt- 00:08:16.136 [2024-11-18 14:21:12.021687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.136 [2024-11-18 14:21:12.021715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.136 #27 NEW cov: 12432 ft: 15552 corp: 26/72b lim: 5 exec/s: 27 rss: 73Mb L: 1/5 MS: 1 ShuffleBytes- 00:08:16.136 [2024-11-18 14:21:12.092702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.136 [2024-11-18 14:21:12.092728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.136 [2024-11-18 14:21:12.092851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.136 [2024-11-18 14:21:12.092867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.136 [2024-11-18 14:21:12.092981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.136 [2024-11-18 14:21:12.092996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.136 [2024-11-18 14:21:12.093116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 
cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.136 [2024-11-18 14:21:12.093131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.136 [2024-11-18 14:21:12.163176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.136 [2024-11-18 14:21:12.163205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.136 [2024-11-18 14:21:12.163334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.136 [2024-11-18 14:21:12.163350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.136 [2024-11-18 14:21:12.163471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.136 [2024-11-18 14:21:12.163486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.136 [2024-11-18 14:21:12.163622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.136 [2024-11-18 14:21:12.163638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.136 [2024-11-18 14:21:12.163758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.136 [2024-11-18 14:21:12.163774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:16.136 #29 NEW cov: 12432 ft: 15587 corp: 27/77b lim: 5 exec/s: 29 rss: 74Mb L: 5/5 MS: 2 ShuffleBytes-CopyPart- 00:08:16.136 [2024-11-18 14:21:12.212484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.136 [2024-11-18 14:21:12.212510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.136 [2024-11-18 14:21:12.212647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.136 [2024-11-18 14:21:12.212662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.136 #30 NEW cov: 12432 ft: 15599 corp: 28/79b lim: 5 exec/s: 30 rss: 74Mb L: 2/5 MS: 1 CopyPart- 00:08:16.397 [2024-11-18 14:21:12.283523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.397 [2024-11-18 14:21:12.283552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.397 [2024-11-18 14:21:12.283675] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.397 [2024-11-18 14:21:12.283694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.398 [2024-11-18 14:21:12.283816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.398 [2024-11-18 14:21:12.283833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.398 [2024-11-18 14:21:12.283959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.398 [2024-11-18 14:21:12.283975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.398 [2024-11-18 14:21:12.284093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.398 [2024-11-18 14:21:12.284111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:16.398 #31 NEW cov: 12432 ft: 15681 corp: 29/84b lim: 5 exec/s: 31 rss: 74Mb L: 5/5 MS: 1 ShuffleBytes- 00:08:16.398 [2024-11-18 14:21:12.333610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.398 [2024-11-18 14:21:12.333636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.398 [2024-11-18 14:21:12.333763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.398 [2024-11-18 14:21:12.333778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.398 [2024-11-18 14:21:12.333905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.398 [2024-11-18 14:21:12.333921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.398 [2024-11-18 14:21:12.334045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.398 [2024-11-18 14:21:12.334063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.398 [2024-11-18 14:21:12.334184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.398 [2024-11-18 14:21:12.334200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:16.398 #32 pulse cov: 12432 ft: 15688 corp: 29/84b lim: 5 exec/s: 16 rss: 74Mb 00:08:16.398 
#32 NEW cov: 12432 ft: 15688 corp: 30/89b lim: 5 exec/s: 16 rss: 74Mb L: 5/5 MS: 1 ChangeByte- 00:08:16.398 #32 DONE cov: 12432 ft: 15688 corp: 30/89b lim: 5 exec/s: 16 rss: 74Mb 00:08:16.398 Done 32 runs in 2 second(s) 00:08:16.398 14:21:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_9.conf /var/tmp/suppress_nvmf_fuzz 00:08:16.398 14:21:12 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:16.398 14:21:12 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:16.398 14:21:12 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:08:16.398 14:21:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:08:16.398 14:21:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:16.398 14:21:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:16.398 14:21:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:16.398 14:21:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:08:16.398 14:21:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:16.398 14:21:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:16.398 14:21:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 10 00:08:16.398 14:21:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4410 00:08:16.398 14:21:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:16.398 14:21:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:08:16.398 14:21:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:16.398 14:21:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:16.398 14:21:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:16.398 14:21:12 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 00:08:16.398 [2024-11-18 14:21:12.518321] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 
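The xtrace lines above show how nvmf/run.sh parameterizes each fuzzer instance: the fuzzer index (10 here) is zero-padded with printf %02d and appended to a 44 prefix to form the TCP port, sed rewrites the trsvcid in the stock fuzz_json.conf template so this instance gets its own listener, and the matching transport ID string is handed to llvm_nvme_fuzz via -F. A minimal Bash sketch of that templating logic, with hypothetical variable names (FUZZER_TYPE, CONF_OUT) standing in for the script's locals, and with the sed output redirection inferred since xtrace does not print redirections:

    #!/usr/bin/env bash
    # Sketch of the per-instance setup visible in the trace above, not the
    # verbatim nvmf/run.sh source. Assumes the fuzz_json.conf template
    # defaults to trsvcid 4420, as the sed expression in the trace implies.
    FUZZER_TYPE=10                              # hypothetical stand-in for the script's first argument
    port="44$(printf '%02d' "$FUZZER_TYPE")"    # 10 -> 4410, 11 -> 4411
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    CONF_OUT="/tmp/fuzz_json_${FUZZER_TYPE}.conf"
    # Rewrite the default listener port in the JSON config template.
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        test/fuzz/llvm/nvmf/fuzz_json.conf > "$CONF_OUT"
    echo "would run: llvm_nvme_fuzz -F '$trid' -c $CONF_OUT -Z $FUZZER_TYPE"

Each index therefore gets a distinct port, config file, and corpus directory (llvm_nvmf_10 here, llvm_nvmf_11 in the next run), so consecutive runs cannot collide on a listener or corpus.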
00:08:16.398 [2024-11-18 14:21:12.518393] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid326932 ] 00:08:16.659 [2024-11-18 14:21:12.733954] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:16.659 [2024-11-18 14:21:12.747800] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:16.919 [2024-11-18 14:21:12.800410] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:16.919 [2024-11-18 14:21:12.816716] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:08:16.919 INFO: Running with entropic power schedule (0xFF, 100). 00:08:16.919 INFO: Seed: 2597835346 00:08:16.919 INFO: Loaded 1 modules (387495 inline 8-bit counters): 387495 [0x2acbe4c, 0x2b2a7f3), 00:08:16.919 INFO: Loaded 1 PC tables (387495 PCs): 387495 [0x2b2a7f8,0x3114268), 00:08:16.919 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:16.919 INFO: A corpus is not provided, starting from an empty corpus 00:08:16.919 #2 INITED exec/s: 0 rss: 65Mb 00:08:16.919 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:16.919 This may also happen if the target rejected all inputs we tried so far 00:08:16.919 [2024-11-18 14:21:12.882147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8b8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:16.919 [2024-11-18 14:21:12.882175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.179 NEW_FUNC[1/715]: 0x466508 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:08:17.179 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:17.179 #7 NEW cov: 12210 ft: 12202 corp: 2/14b lim: 40 exec/s: 0 rss: 72Mb L: 13/13 MS: 5 ShuffleBytes-ChangeBit-CopyPart-CopyPart-InsertRepeatedBytes- 00:08:17.179 [2024-11-18 14:21:13.213139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8b8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.179 [2024-11-18 14:21:13.213194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.179 #8 NEW cov: 12340 ft: 13010 corp: 3/27b lim: 40 exec/s: 0 rss: 72Mb L: 13/13 MS: 1 ShuffleBytes- 00:08:17.180 [2024-11-18 14:21:13.283077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8b8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.180 [2024-11-18 14:21:13.283104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.440 #9 NEW cov: 12346 ft: 13292 corp: 4/38b lim: 40 exec/s: 0 rss: 72Mb L: 11/13 MS: 1 EraseBytes- 00:08:17.440 [2024-11-18 14:21:13.343202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8f8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.440 [2024-11-18 
14:21:13.343227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.440 #10 NEW cov: 12431 ft: 13576 corp: 5/51b lim: 40 exec/s: 0 rss: 72Mb L: 13/13 MS: 1 ChangeByte- 00:08:17.440 [2024-11-18 14:21:13.383358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8b8b8b8b cdw11:8beb8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.440 [2024-11-18 14:21:13.383387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.440 #11 NEW cov: 12431 ft: 13657 corp: 6/64b lim: 40 exec/s: 0 rss: 72Mb L: 13/13 MS: 1 ChangeByte- 00:08:17.440 [2024-11-18 14:21:13.423780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8f8b8b00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.440 [2024-11-18 14:21:13.423804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.440 [2024-11-18 14:21:13.423864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.440 [2024-11-18 14:21:13.423877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.440 [2024-11-18 14:21:13.423935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.440 [2024-11-18 14:21:13.423948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.440 [2024-11-18 14:21:13.424005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:8b8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.440 [2024-11-18 14:21:13.424018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.440 #12 NEW cov: 12431 ft: 14352 corp: 7/98b lim: 40 exec/s: 0 rss: 72Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:08:17.440 [2024-11-18 14:21:13.483628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8b8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.440 [2024-11-18 14:21:13.483654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.440 #13 NEW cov: 12431 ft: 14466 corp: 8/111b lim: 40 exec/s: 0 rss: 72Mb L: 13/34 MS: 1 ShuffleBytes- 00:08:17.440 [2024-11-18 14:21:13.523729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8b8b8b8b cdw11:8b8b8f8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.440 [2024-11-18 14:21:13.523754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.440 #14 NEW cov: 12431 ft: 14474 corp: 9/124b lim: 40 exec/s: 0 rss: 72Mb L: 13/34 MS: 1 ShuffleBytes- 00:08:17.440 [2024-11-18 14:21:13.564219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e6e6e6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:17.440 [2024-11-18 14:21:13.564244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.440 [2024-11-18 14:21:13.564306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e6e6e6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.440 [2024-11-18 14:21:13.564320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.440 [2024-11-18 14:21:13.564377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e6e6e6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.440 [2024-11-18 14:21:13.564391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.440 [2024-11-18 14:21:13.564451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:e6e6e6e6 cdw11:e6e6db0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.440 [2024-11-18 14:21:13.564464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.701 #18 NEW cov: 12431 ft: 14497 corp: 10/156b lim: 40 exec/s: 0 rss: 72Mb L: 32/34 MS: 4 ShuffleBytes-InsertByte-ChangeBit-InsertRepeatedBytes- 00:08:17.701 [2024-11-18 14:21:13.603932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01008b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.701 [2024-11-18 14:21:13.603957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.701 #19 NEW cov: 12431 ft: 14559 corp: 11/169b lim: 40 exec/s: 0 rss: 72Mb L: 13/34 MS: 1 CMP- DE: "\001\000"- 00:08:17.701 [2024-11-18 14:21:13.644415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8f8b8b00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.701 [2024-11-18 14:21:13.644440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.701 [2024-11-18 14:21:13.644500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:01000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.701 [2024-11-18 14:21:13.644514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.701 [2024-11-18 14:21:13.644576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.701 [2024-11-18 14:21:13.644589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.701 [2024-11-18 14:21:13.644648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:8b8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.701 [2024-11-18 14:21:13.644661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.701 #20 NEW cov: 12431 ft: 14579 corp: 12/203b lim: 40 exec/s: 0 rss: 73Mb L: 34/34 MS: 1 
PersAutoDict- DE: "\001\000"- 00:08:17.701 [2024-11-18 14:21:13.704589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e6e6e6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.701 [2024-11-18 14:21:13.704613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.701 [2024-11-18 14:21:13.704691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:e6e6e6e6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.701 [2024-11-18 14:21:13.704705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.701 [2024-11-18 14:21:13.704762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e6e6e6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.701 [2024-11-18 14:21:13.704775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.701 [2024-11-18 14:21:13.704834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:e6e6e6e6 cdw11:e6e6db0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.702 [2024-11-18 14:21:13.704847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.702 #21 NEW cov: 12431 ft: 14598 corp: 13/235b lim: 40 exec/s: 0 rss: 73Mb L: 32/34 MS: 1 ShuffleBytes- 00:08:17.702 [2024-11-18 14:21:13.764421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8b8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.702 [2024-11-18 14:21:13.764446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.702 NEW_FUNC[1/1]: 0x1c52b68 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:17.702 #22 NEW cov: 12454 ft: 14673 corp: 14/248b lim: 40 exec/s: 0 rss: 73Mb L: 13/34 MS: 1 CrossOver- 00:08:17.702 [2024-11-18 14:21:13.824843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8b8b8bea cdw11:eaeaeaea SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.702 [2024-11-18 14:21:13.824868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.702 [2024-11-18 14:21:13.824946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:eaeaeaea cdw11:eaeaeaea SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.702 [2024-11-18 14:21:13.824961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.702 [2024-11-18 14:21:13.825020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:8b8beb8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.702 [2024-11-18 14:21:13.825034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.962 #23 NEW cov: 12454 ft: 14897 corp: 15/274b lim: 40 exec/s: 23 rss: 73Mb L: 26/34 MS: 1 InsertRepeatedBytes- 00:08:17.962 [2024-11-18 14:21:13.885107] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e6e6e6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.962 [2024-11-18 14:21:13.885132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.962 [2024-11-18 14:21:13.885194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000020 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.962 [2024-11-18 14:21:13.885208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.962 [2024-11-18 14:21:13.885270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e6e6e6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.962 [2024-11-18 14:21:13.885283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.962 [2024-11-18 14:21:13.885342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:e6e6e6e6 cdw11:e6e6db0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.962 [2024-11-18 14:21:13.885355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.962 #24 NEW cov: 12454 ft: 14909 corp: 16/306b lim: 40 exec/s: 24 rss: 73Mb L: 32/34 MS: 1 ChangeBinInt- 00:08:17.962 [2024-11-18 14:21:13.924857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8b8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.962 [2024-11-18 14:21:13.924883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.962 #25 NEW cov: 12454 ft: 14931 corp: 17/317b lim: 40 exec/s: 25 rss: 73Mb L: 11/34 MS: 1 ShuffleBytes- 00:08:17.962 [2024-11-18 14:21:13.985108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8b8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.962 [2024-11-18 14:21:13.985133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.962 #26 NEW cov: 12454 ft: 14958 corp: 18/330b lim: 40 exec/s: 26 rss: 73Mb L: 13/34 MS: 1 ShuffleBytes- 00:08:17.962 [2024-11-18 14:21:14.025233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01008b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.962 [2024-11-18 14:21:14.025259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.962 #27 NEW cov: 12454 ft: 14989 corp: 19/343b lim: 40 exec/s: 27 rss: 73Mb L: 13/34 MS: 1 ShuffleBytes- 00:08:17.962 [2024-11-18 14:21:14.085542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8b8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.962 [2024-11-18 14:21:14.085572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.962 [2024-11-18 14:21:14.085631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 
cid:5 nsid:0 cdw10:8b8b8b8b cdw11:1a8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.962 [2024-11-18 14:21:14.085646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.223 #28 NEW cov: 12454 ft: 15183 corp: 20/362b lim: 40 exec/s: 28 rss: 73Mb L: 19/34 MS: 1 CopyPart- 00:08:18.223 [2024-11-18 14:21:14.145796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a858585 cdw11:85858585 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.223 [2024-11-18 14:21:14.145822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.223 [2024-11-18 14:21:14.145883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:85858585 cdw11:85858585 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.223 [2024-11-18 14:21:14.145896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.223 [2024-11-18 14:21:14.145957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:85858585 cdw11:85858585 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.223 [2024-11-18 14:21:14.145970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.223 #30 NEW cov: 12454 ft: 15187 corp: 21/393b lim: 40 exec/s: 30 rss: 73Mb L: 31/34 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:18.223 [2024-11-18 14:21:14.185680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8b8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.223 [2024-11-18 14:21:14.185704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.223 #31 NEW cov: 12454 ft: 15232 corp: 22/406b lim: 40 exec/s: 31 rss: 73Mb L: 13/34 MS: 1 ChangeBit- 00:08:18.223 [2024-11-18 14:21:14.225778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:06000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.223 [2024-11-18 14:21:14.225803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.223 #32 NEW cov: 12454 ft: 15243 corp: 23/419b lim: 40 exec/s: 32 rss: 73Mb L: 13/34 MS: 1 CMP- DE: "\006\000\000\000\000\000\000\000"- 00:08:18.223 [2024-11-18 14:21:14.266316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e6e6e6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.223 [2024-11-18 14:21:14.266340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.223 [2024-11-18 14:21:14.266418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:66e6e6e6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.223 [2024-11-18 14:21:14.266433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.223 [2024-11-18 14:21:14.266492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e6e6e6 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.223 [2024-11-18 14:21:14.266505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.223 [2024-11-18 14:21:14.266571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:e6e6e6e6 cdw11:e6e6db0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.223 [2024-11-18 14:21:14.266589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.223 #33 NEW cov: 12454 ft: 15272 corp: 24/451b lim: 40 exec/s: 33 rss: 73Mb L: 32/34 MS: 1 ChangeBit- 00:08:18.223 [2024-11-18 14:21:14.326181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:06000000 cdw11:008b0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.223 [2024-11-18 14:21:14.326205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.223 [2024-11-18 14:21:14.326282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0000008b cdw11:8b8b8b1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.223 [2024-11-18 14:21:14.326297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.484 #34 NEW cov: 12454 ft: 15303 corp: 25/467b lim: 40 exec/s: 34 rss: 74Mb L: 16/34 MS: 1 CopyPart- 00:08:18.484 [2024-11-18 14:21:14.386244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:b7060000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.484 [2024-11-18 14:21:14.386269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.484 #37 NEW cov: 12454 ft: 15304 corp: 26/476b lim: 40 exec/s: 37 rss: 74Mb L: 9/34 MS: 3 ChangeByte-ChangeByte-PersAutoDict- DE: "\006\000\000\000\000\000\000\000"- 00:08:18.484 [2024-11-18 14:21:14.426374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8b8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.484 [2024-11-18 14:21:14.426399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.484 #38 NEW cov: 12454 ft: 15313 corp: 27/489b lim: 40 exec/s: 38 rss: 74Mb L: 13/34 MS: 1 PersAutoDict- DE: "\001\000"- 00:08:18.484 [2024-11-18 14:21:14.486581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8b898b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.484 [2024-11-18 14:21:14.486606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.484 #39 NEW cov: 12454 ft: 15335 corp: 28/502b lim: 40 exec/s: 39 rss: 74Mb L: 13/34 MS: 1 ChangeBit- 00:08:18.484 [2024-11-18 14:21:14.547080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8f8b8b00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.484 [2024-11-18 14:21:14.547105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.484 [2024-11-18 14:21:14.547164] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:01000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.484 [2024-11-18 14:21:14.547178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.484 [2024-11-18 14:21:14.547236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.484 [2024-11-18 14:21:14.547250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.485 [2024-11-18 14:21:14.547309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:8b8b8b8b cdw11:8b8b9b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.485 [2024-11-18 14:21:14.547323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.485 #40 NEW cov: 12454 ft: 15346 corp: 29/536b lim: 40 exec/s: 40 rss: 74Mb L: 34/34 MS: 1 ChangeBit- 00:08:18.485 [2024-11-18 14:21:14.607246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e6e6e6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.485 [2024-11-18 14:21:14.607272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.485 [2024-11-18 14:21:14.607333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:66e6e6e6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.485 [2024-11-18 14:21:14.607348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.485 [2024-11-18 14:21:14.607408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e6e6e6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.485 [2024-11-18 14:21:14.607423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.485 [2024-11-18 14:21:14.607481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:e6e6e6e6 cdw11:e6e6db0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.485 [2024-11-18 14:21:14.607494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.745 #41 NEW cov: 12454 ft: 15396 corp: 30/568b lim: 40 exec/s: 41 rss: 74Mb L: 32/34 MS: 1 ShuffleBytes- 00:08:18.745 [2024-11-18 14:21:14.667272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:06000000 cdw11:008b0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.745 [2024-11-18 14:21:14.667297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.745 [2024-11-18 14:21:14.667359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0000008b cdw11:8b8b0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.745 [2024-11-18 14:21:14.667373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:08:18.745 [2024-11-18 14:21:14.667431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:0000008b cdw11:8b8b8b1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.745 [2024-11-18 14:21:14.667444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.745 #42 NEW cov: 12454 ft: 15400 corp: 31/592b lim: 40 exec/s: 42 rss: 74Mb L: 24/34 MS: 1 CopyPart- 00:08:18.745 [2024-11-18 14:21:14.727220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8b8b0600 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.745 [2024-11-18 14:21:14.727244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.745 #43 NEW cov: 12454 ft: 15440 corp: 32/603b lim: 40 exec/s: 43 rss: 74Mb L: 11/34 MS: 1 PersAutoDict- DE: "\006\000\000\000\000\000\000\000"- 00:08:18.745 [2024-11-18 14:21:14.787733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:e6e6e6e6 cdw11:e6e6e6e6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.745 [2024-11-18 14:21:14.787758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.745 [2024-11-18 14:21:14.787815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:e6e6e6e6 cdw11:66e6e6e6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.745 [2024-11-18 14:21:14.787829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.745 [2024-11-18 14:21:14.787886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:e6e6e6e6 cdw11:e6e666e6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.745 [2024-11-18 14:21:14.787900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.746 [2024-11-18 14:21:14.787961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:e6e6e6e6 cdw11:e6e6e6e6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.746 [2024-11-18 14:21:14.787974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.746 #44 NEW cov: 12454 ft: 15453 corp: 33/641b lim: 40 exec/s: 44 rss: 74Mb L: 38/38 MS: 1 CopyPart- 00:08:18.746 [2024-11-18 14:21:14.827455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8b8b8b8b cdw11:8b8b908b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.746 [2024-11-18 14:21:14.827481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.746 #45 NEW cov: 12454 ft: 15466 corp: 34/654b lim: 40 exec/s: 45 rss: 74Mb L: 13/38 MS: 1 ChangeBinInt- 00:08:18.746 [2024-11-18 14:21:14.867752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8b8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.746 [2024-11-18 14:21:14.867777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.746 [2024-11-18 14:21:14.867853] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:8b8b1a91 cdw11:91919191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.746 [2024-11-18 14:21:14.867867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.007 #46 NEW cov: 12454 ft: 15470 corp: 35/676b lim: 40 exec/s: 23 rss: 74Mb L: 22/38 MS: 1 InsertRepeatedBytes- 00:08:19.007 #46 DONE cov: 12454 ft: 15470 corp: 35/676b lim: 40 exec/s: 23 rss: 74Mb 00:08:19.007 ###### Recommended dictionary. ###### 00:08:19.007 "\001\000" # Uses: 2 00:08:19.007 "\006\000\000\000\000\000\000\000" # Uses: 2 00:08:19.007 ###### End of recommended dictionary. ###### 00:08:19.007 Done 46 runs in 2 second(s) 00:08:19.007 14:21:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_10.conf /var/tmp/suppress_nvmf_fuzz 00:08:19.007 14:21:14 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:19.007 14:21:14 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:19.007 14:21:14 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:08:19.007 14:21:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:08:19.007 14:21:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:19.007 14:21:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:19.007 14:21:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:19.007 14:21:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:08:19.007 14:21:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:19.007 14:21:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:19.007 14:21:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 11 00:08:19.007 14:21:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4411 00:08:19.007 14:21:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:19.007 14:21:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:08:19.007 14:21:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:19.007 14:21:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:19.007 14:21:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:19.007 14:21:15 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 00:08:19.007 [2024-11-18 14:21:15.033977] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 
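The wrapper also prepares the LeakSanitizer inputs seen in this trace: the LSAN_OPTIONS local set at nvmf/run.sh@32 points at /var/tmp/suppress_nvmf_fuzz, and the two echo lines at @41 and @42 populate that file with leak:spdk_nvmf_qpair_disconnect and leak:nvmf_ctrlr_create, so known shutdown-path allocations reached through those symbols are not reported as leaks when the fuzzer exits. A hedged Bash reconstruction follows; xtrace hides redirections, so the > and >> targets are inferred, and run.sh declares LSAN_OPTIONS as a shell local rather than exporting it as done here:

    #!/usr/bin/env bash
    # Sketch of the suppression setup implied by the trace; redirect targets inferred.
    suppress_file=/var/tmp/suppress_nvmf_fuzz
    export LSAN_OPTIONS="report_objects=1:suppressions=$suppress_file:print_suppressions=0"
    # LSAN skips any leak whose stack matches these SPDK symbols.
    echo "leak:spdk_nvmf_qpair_disconnect" > "$suppress_file"
    echo "leak:nvmf_ctrlr_create" >> "$suppress_file"

With print_suppressions=0 the matched suppressions are not echoed at exit, which is why the runs above end at "Done N runs" without an LSAN summary.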
00:08:19.007 [2024-11-18 14:21:15.034044] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid327447 ] 00:08:19.267 [2024-11-18 14:21:15.241871] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:19.267 [2024-11-18 14:21:15.254899] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:19.267 [2024-11-18 14:21:15.307237] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:19.267 [2024-11-18 14:21:15.323561] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:08:19.267 INFO: Running with entropic power schedule (0xFF, 100). 00:08:19.267 INFO: Seed: 809877256 00:08:19.267 INFO: Loaded 1 modules (387495 inline 8-bit counters): 387495 [0x2acbe4c, 0x2b2a7f3), 00:08:19.267 INFO: Loaded 1 PC tables (387495 PCs): 387495 [0x2b2a7f8,0x3114268), 00:08:19.267 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:19.268 INFO: A corpus is not provided, starting from an empty corpus 00:08:19.268 #2 INITED exec/s: 0 rss: 65Mb 00:08:19.268 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:19.268 This may also happen if the target rejected all inputs we tried so far 00:08:19.268 [2024-11-18 14:21:15.379247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.268 [2024-11-18 14:21:15.379273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.268 [2024-11-18 14:21:15.379334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.268 [2024-11-18 14:21:15.379348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.268 [2024-11-18 14:21:15.379404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.268 [2024-11-18 14:21:15.379417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.788 NEW_FUNC[1/716]: 0x468278 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:08:19.788 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:19.788 #42 NEW cov: 12237 ft: 12237 corp: 2/30b lim: 40 exec/s: 0 rss: 73Mb L: 29/29 MS: 5 CopyPart-ChangeByte-ChangeBit-CopyPart-InsertRepeatedBytes- 00:08:19.788 [2024-11-18 14:21:15.710120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.788 [2024-11-18 14:21:15.710172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.788 [2024-11-18 14:21:15.710267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND 
(81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.788 [2024-11-18 14:21:15.710294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.788 #43 NEW cov: 12353 ft: 13203 corp: 3/50b lim: 40 exec/s: 0 rss: 73Mb L: 20/29 MS: 1 EraseBytes- 00:08:19.788 [2024-11-18 14:21:15.780205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff25 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.788 [2024-11-18 14:21:15.780234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.788 [2024-11-18 14:21:15.780292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.788 [2024-11-18 14:21:15.780306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.788 [2024-11-18 14:21:15.780360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.788 [2024-11-18 14:21:15.780373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.788 #44 NEW cov: 12359 ft: 13419 corp: 4/80b lim: 40 exec/s: 0 rss: 73Mb L: 30/30 MS: 1 InsertByte- 00:08:19.788 [2024-11-18 14:21:15.820298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.788 [2024-11-18 14:21:15.820322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.788 [2024-11-18 14:21:15.820377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.788 [2024-11-18 14:21:15.820390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.788 [2024-11-18 14:21:15.820444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.788 [2024-11-18 14:21:15.820458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.788 #45 NEW cov: 12444 ft: 13676 corp: 5/106b lim: 40 exec/s: 0 rss: 73Mb L: 26/30 MS: 1 EraseBytes- 00:08:19.788 [2024-11-18 14:21:15.860563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff25 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.788 [2024-11-18 14:21:15.860588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.788 [2024-11-18 14:21:15.860663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffff05 cdw11:00ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.788 [2024-11-18 14:21:15.860677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:08:19.788 [2024-11-18 14:21:15.860734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.788 [2024-11-18 14:21:15.860748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.788 [2024-11-18 14:21:15.860802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff0a680a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.789 [2024-11-18 14:21:15.860815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.789 #46 NEW cov: 12444 ft: 14042 corp: 6/138b lim: 40 exec/s: 0 rss: 73Mb L: 32/32 MS: 1 CMP- DE: "\005\000"- 00:08:20.049 [2024-11-18 14:21:15.920449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.049 [2024-11-18 14:21:15.920475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.049 [2024-11-18 14:21:15.920533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0a68 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.049 [2024-11-18 14:21:15.920556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.049 #47 NEW cov: 12444 ft: 14134 corp: 7/155b lim: 40 exec/s: 0 rss: 73Mb L: 17/32 MS: 1 EraseBytes- 00:08:20.049 [2024-11-18 14:21:15.960864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.049 [2024-11-18 14:21:15.960888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.049 [2024-11-18 14:21:15.960962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.049 [2024-11-18 14:21:15.960976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.049 [2024-11-18 14:21:15.961031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffe7e7e7 cdw11:e7ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.049 [2024-11-18 14:21:15.961044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.049 [2024-11-18 14:21:15.961101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0a68 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.049 [2024-11-18 14:21:15.961114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.049 #53 NEW cov: 12444 ft: 14173 corp: 8/188b lim: 40 exec/s: 0 rss: 73Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:20.050 [2024-11-18 14:21:16.000971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.050 [2024-11-18 
14:21:16.000996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.050 [2024-11-18 14:21:16.001052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.050 [2024-11-18 14:21:16.001065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.050 [2024-11-18 14:21:16.001120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffe7e7e7 cdw11:e7ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.050 [2024-11-18 14:21:16.001134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.050 [2024-11-18 14:21:16.001189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0a68 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.050 [2024-11-18 14:21:16.001203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.050 #54 NEW cov: 12444 ft: 14188 corp: 9/221b lim: 40 exec/s: 0 rss: 73Mb L: 33/33 MS: 1 ShuffleBytes- 00:08:20.050 [2024-11-18 14:21:16.061149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.050 [2024-11-18 14:21:16.061174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.050 [2024-11-18 14:21:16.061229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.050 [2024-11-18 14:21:16.061242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.050 [2024-11-18 14:21:16.061297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffe7e7e7 cdw11:e7ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.050 [2024-11-18 14:21:16.061313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.050 [2024-11-18 14:21:16.061368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000200 cdw11:ffff0a68 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.050 [2024-11-18 14:21:16.061381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.050 #55 NEW cov: 12444 ft: 14211 corp: 10/254b lim: 40 exec/s: 0 rss: 73Mb L: 33/33 MS: 1 CMP- DE: "\000\000\002\000"- 00:08:20.050 [2024-11-18 14:21:16.101111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.050 [2024-11-18 14:21:16.101137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.050 [2024-11-18 14:21:16.101193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffe7e7e7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:20.050 [2024-11-18 14:21:16.101208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.050 [2024-11-18 14:21:16.101264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:e7ffffff cdw11:00000200 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.050 [2024-11-18 14:21:16.101278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.050 #56 NEW cov: 12444 ft: 14291 corp: 11/283b lim: 40 exec/s: 0 rss: 73Mb L: 29/33 MS: 1 EraseBytes- 00:08:20.050 [2024-11-18 14:21:16.161112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.050 [2024-11-18 14:21:16.161137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.050 [2024-11-18 14:21:16.161195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.050 [2024-11-18 14:21:16.161208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.311 #57 NEW cov: 12444 ft: 14315 corp: 12/303b lim: 40 exec/s: 0 rss: 74Mb L: 20/33 MS: 1 CopyPart- 00:08:20.311 [2024-11-18 14:21:16.221484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.311 [2024-11-18 14:21:16.221509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.311 [2024-11-18 14:21:16.221581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffe7e7e7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.311 [2024-11-18 14:21:16.221596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.311 [2024-11-18 14:21:16.221653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:e7ffffff cdw11:00000200 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.311 [2024-11-18 14:21:16.221667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.311 NEW_FUNC[1/1]: 0x1c52b68 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:20.311 #58 NEW cov: 12467 ft: 14384 corp: 13/332b lim: 40 exec/s: 0 rss: 74Mb L: 29/33 MS: 1 CrossOver- 00:08:20.311 [2024-11-18 14:21:16.281744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.311 [2024-11-18 14:21:16.281769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.311 [2024-11-18 14:21:16.281831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffe7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.311 [2024-11-18 14:21:16.281845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.311 [2024-11-18 14:21:16.281902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffe7e7 cdw11:e7ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.311 [2024-11-18 14:21:16.281915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.311 [2024-11-18 14:21:16.281971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0a68 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.311 [2024-11-18 14:21:16.281984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.311 #59 NEW cov: 12467 ft: 14446 corp: 14/365b lim: 40 exec/s: 0 rss: 74Mb L: 33/33 MS: 1 ShuffleBytes- 00:08:20.311 [2024-11-18 14:21:16.321700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.311 [2024-11-18 14:21:16.321724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.311 [2024-11-18 14:21:16.321799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.311 [2024-11-18 14:21:16.321813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.311 [2024-11-18 14:21:16.321874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.311 [2024-11-18 14:21:16.321887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.311 #60 NEW cov: 12467 ft: 14483 corp: 15/394b lim: 40 exec/s: 0 rss: 74Mb L: 29/33 MS: 1 CopyPart- 00:08:20.311 [2024-11-18 14:21:16.361975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff25 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.311 [2024-11-18 14:21:16.361999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.311 [2024-11-18 14:21:16.362060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.311 [2024-11-18 14:21:16.362073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.311 [2024-11-18 14:21:16.362130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.311 [2024-11-18 14:21:16.362143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.311 [2024-11-18 14:21:16.362200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffff0a cdw11:68000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.311 [2024-11-18 14:21:16.362213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.311 #61 NEW cov: 12467 ft: 14504 corp: 16/428b lim: 40 exec/s: 61 rss: 74Mb L: 34/34 MS: 1 PersAutoDict- DE: "\000\000\002\000"- 00:08:20.311 [2024-11-18 14:21:16.401913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ff0500ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.311 [2024-11-18 14:21:16.401940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.311 [2024-11-18 14:21:16.402002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffe7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.311 [2024-11-18 14:21:16.402016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.311 [2024-11-18 14:21:16.402072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:e7e7e7ff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.311 [2024-11-18 14:21:16.402085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.311 #62 NEW cov: 12467 ft: 14552 corp: 17/459b lim: 40 exec/s: 62 rss: 74Mb L: 31/34 MS: 1 PersAutoDict- DE: "\005\000"- 00:08:20.572 [2024-11-18 14:21:16.442197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.572 [2024-11-18 14:21:16.442222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.572 [2024-11-18 14:21:16.442279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffff00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.572 [2024-11-18 14:21:16.442308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.572 [2024-11-18 14:21:16.442367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:000000ff cdw11:ffe7e7e7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.572 [2024-11-18 14:21:16.442381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.572 [2024-11-18 14:21:16.442442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:e7ffffff cdw11:00000200 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.572 [2024-11-18 14:21:16.442455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.572 #63 NEW cov: 12467 ft: 14584 corp: 18/496b lim: 40 exec/s: 63 rss: 74Mb L: 37/37 MS: 1 InsertRepeatedBytes- 00:08:20.572 [2024-11-18 14:21:16.502399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:cfffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.572 [2024-11-18 14:21:16.502424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.572 [2024-11-18 14:21:16.502499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:25ffffff 
cdw11:0500ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.572 [2024-11-18 14:21:16.502513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.572 [2024-11-18 14:21:16.502573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.572 [2024-11-18 14:21:16.502587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.572 [2024-11-18 14:21:16.502644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0a68 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.572 [2024-11-18 14:21:16.502657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.572 #64 NEW cov: 12467 ft: 14606 corp: 19/529b lim: 40 exec/s: 64 rss: 74Mb L: 33/37 MS: 1 InsertByte- 00:08:20.572 [2024-11-18 14:21:16.562533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.572 [2024-11-18 14:21:16.562572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.572 [2024-11-18 14:21:16.562646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ff000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.572 [2024-11-18 14:21:16.562660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.572 [2024-11-18 14:21:16.562719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:21e7e7e7 cdw11:e7ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.572 [2024-11-18 14:21:16.562732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.572 [2024-11-18 14:21:16.562787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000200 cdw11:ffff0a68 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.572 [2024-11-18 14:21:16.562801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.572 #65 NEW cov: 12467 ft: 14667 corp: 20/562b lim: 40 exec/s: 65 rss: 74Mb L: 33/37 MS: 1 ChangeBinInt- 00:08:20.572 [2024-11-18 14:21:16.602660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff32ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.572 [2024-11-18 14:21:16.602684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.572 [2024-11-18 14:21:16.602743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffff00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.572 [2024-11-18 14:21:16.602756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.572 [2024-11-18 14:21:16.602811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 
cdw10:000000ff cdw11:ffe7e7e7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.572 [2024-11-18 14:21:16.602824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.572 [2024-11-18 14:21:16.602881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:e7ffffff cdw11:00000200 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.572 [2024-11-18 14:21:16.602894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.572 #66 NEW cov: 12467 ft: 14672 corp: 21/599b lim: 40 exec/s: 66 rss: 74Mb L: 37/37 MS: 1 ChangeByte- 00:08:20.572 [2024-11-18 14:21:16.662846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:21ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.572 [2024-11-18 14:21:16.662871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.572 [2024-11-18 14:21:16.662943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.572 [2024-11-18 14:21:16.662957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.572 [2024-11-18 14:21:16.663014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffe7e7e7 cdw11:e7ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.572 [2024-11-18 14:21:16.663026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.572 [2024-11-18 14:21:16.663084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0a68 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.572 [2024-11-18 14:21:16.663097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.832 #67 NEW cov: 12467 ft: 14743 corp: 22/632b lim: 40 exec/s: 67 rss: 74Mb L: 33/37 MS: 1 ChangeBinInt- 00:08:20.832 [2024-11-18 14:21:16.723010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.832 [2024-11-18 14:21:16.723036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.832 [2024-11-18 14:21:16.723094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00ffff00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.832 [2024-11-18 14:21:16.723108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.832 [2024-11-18 14:21:16.723164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:000000ff cdw11:ffe7e7e7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.832 [2024-11-18 14:21:16.723177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.832 [2024-11-18 14:21:16.723233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 
cid:7 nsid:0 cdw10:e7ffffff cdw11:00000200 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.833 [2024-11-18 14:21:16.723246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.833 #68 NEW cov: 12467 ft: 14755 corp: 23/669b lim: 40 exec/s: 68 rss: 74Mb L: 37/37 MS: 1 PersAutoDict- DE: "\000\000\002\000"- 00:08:20.833 [2024-11-18 14:21:16.762802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.833 [2024-11-18 14:21:16.762826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.833 [2024-11-18 14:21:16.762898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.833 [2024-11-18 14:21:16.762912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.833 #69 NEW cov: 12467 ft: 14816 corp: 24/689b lim: 40 exec/s: 69 rss: 74Mb L: 20/37 MS: 1 CrossOver- 00:08:20.833 [2024-11-18 14:21:16.823269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.833 [2024-11-18 14:21:16.823294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.833 [2024-11-18 14:21:16.823352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00ffff00 cdw11:00bd0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.833 [2024-11-18 14:21:16.823366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.833 [2024-11-18 14:21:16.823424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:ffffe7e7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.833 [2024-11-18 14:21:16.823437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.833 [2024-11-18 14:21:16.823493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:e7e7ffff cdw11:ff000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.833 [2024-11-18 14:21:16.823506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.833 #70 NEW cov: 12467 ft: 14830 corp: 25/727b lim: 40 exec/s: 70 rss: 75Mb L: 38/38 MS: 1 InsertByte- 00:08:20.833 [2024-11-18 14:21:16.883414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.833 [2024-11-18 14:21:16.883442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.833 [2024-11-18 14:21:16.883514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ff000000 cdw11:00ec0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.833 [2024-11-18 14:21:16.883529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:08:20.833 [2024-11-18 14:21:16.883590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:21e7e7e7 cdw11:e7ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.833 [2024-11-18 14:21:16.883604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.833 [2024-11-18 14:21:16.883672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000200 cdw11:ffff0a68 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.833 [2024-11-18 14:21:16.883685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.833 #71 NEW cov: 12467 ft: 14835 corp: 26/760b lim: 40 exec/s: 71 rss: 75Mb L: 33/38 MS: 1 ChangeByte- 00:08:20.833 [2024-11-18 14:21:16.943449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.833 [2024-11-18 14:21:16.943473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.833 [2024-11-18 14:21:16.943531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.833 [2024-11-18 14:21:16.943545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.833 [2024-11-18 14:21:16.943610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.833 [2024-11-18 14:21:16.943624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.093 #72 NEW cov: 12467 ft: 14850 corp: 27/791b lim: 40 exec/s: 72 rss: 75Mb L: 31/38 MS: 1 InsertRepeatedBytes- 00:08:21.093 [2024-11-18 14:21:16.983706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.093 [2024-11-18 14:21:16.983731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.093 [2024-11-18 14:21:16.983787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.093 [2024-11-18 14:21:16.983800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.093 [2024-11-18 14:21:16.983857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffe7e6e7 cdw11:e7ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.093 [2024-11-18 14:21:16.983871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.093 [2024-11-18 14:21:16.983928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000200 cdw11:ffff0a68 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.093 [2024-11-18 14:21:16.983942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.093 #73 NEW cov: 12467 ft: 14872 corp: 28/824b lim: 40 exec/s: 73 rss: 75Mb L: 33/38 MS: 1 ChangeBit- 00:08:21.093 [2024-11-18 14:21:17.023858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff25 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.093 [2024-11-18 14:21:17.023886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.093 [2024-11-18 14:21:17.023961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.093 [2024-11-18 14:21:17.023974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.093 [2024-11-18 14:21:17.024029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:0200ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.093 [2024-11-18 14:21:17.024043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.093 [2024-11-18 14:21:17.024097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffff0a cdw11:68000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.094 [2024-11-18 14:21:17.024109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.094 #74 NEW cov: 12467 ft: 14895 corp: 29/858b lim: 40 exec/s: 74 rss: 75Mb L: 34/38 MS: 1 PersAutoDict- DE: "\000\000\002\000"- 00:08:21.094 [2024-11-18 14:21:17.084020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.094 [2024-11-18 14:21:17.084045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.094 [2024-11-18 14:21:17.084116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:ff25ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.094 [2024-11-18 14:21:17.084129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.094 [2024-11-18 14:21:17.084187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.094 [2024-11-18 14:21:17.084200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.094 [2024-11-18 14:21:17.084254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.094 [2024-11-18 14:21:17.084268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.094 #75 NEW cov: 12467 ft: 14902 corp: 30/894b lim: 40 exec/s: 75 rss: 75Mb L: 36/38 MS: 1 InsertRepeatedBytes- 00:08:21.094 [2024-11-18 14:21:17.123718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.094 [2024-11-18 14:21:17.123742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.094 #76 NEW cov: 12467 ft: 15573 corp: 31/903b lim: 40 exec/s: 76 rss: 75Mb L: 9/38 MS: 1 CrossOver- 00:08:21.094 [2024-11-18 14:21:17.163978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:0500ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.094 [2024-11-18 14:21:17.164003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.094 [2024-11-18 14:21:17.164074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.094 [2024-11-18 14:21:17.164088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.094 #77 NEW cov: 12467 ft: 15604 corp: 32/925b lim: 40 exec/s: 77 rss: 75Mb L: 22/38 MS: 1 PersAutoDict- DE: "\005\000"- 00:08:21.094 [2024-11-18 14:21:17.204220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.094 [2024-11-18 14:21:17.204244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.094 [2024-11-18 14:21:17.204315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffe7e7e7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.094 [2024-11-18 14:21:17.204329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.094 [2024-11-18 14:21:17.204385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:e7ffffff cdw11:00000200 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.094 [2024-11-18 14:21:17.204398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.355 #78 NEW cov: 12467 ft: 15631 corp: 33/954b lim: 40 exec/s: 78 rss: 75Mb L: 29/38 MS: 1 ShuffleBytes- 00:08:21.355 [2024-11-18 14:21:17.244340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.355 [2024-11-18 14:21:17.244365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.355 [2024-11-18 14:21:17.244423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffe77ee7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.355 [2024-11-18 14:21:17.244437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.355 [2024-11-18 14:21:17.244491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:e7ffffff cdw11:00000200 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.355 [2024-11-18 14:21:17.244504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.355 #79 NEW cov: 12467 ft: 
15646 corp: 34/983b lim: 40 exec/s: 79 rss: 75Mb L: 29/38 MS: 1 ChangeByte- 00:08:21.355 [2024-11-18 14:21:17.284620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.355 [2024-11-18 14:21:17.284646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.355 [2024-11-18 14:21:17.284704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00ffff00 cdw11:00bd0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.355 [2024-11-18 14:21:17.284718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.355 [2024-11-18 14:21:17.284772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:ffffe7e7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.355 [2024-11-18 14:21:17.284786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.355 [2024-11-18 14:21:17.284838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:e7e7ffff cdw11:ff000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.355 [2024-11-18 14:21:17.284851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.355 #80 NEW cov: 12467 ft: 15678 corp: 35/1021b lim: 40 exec/s: 80 rss: 75Mb L: 38/38 MS: 1 ChangeBinInt- 00:08:21.355 [2024-11-18 14:21:17.344786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.355 [2024-11-18 14:21:17.344810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.355 [2024-11-18 14:21:17.344872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.355 [2024-11-18 14:21:17.344885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.355 [2024-11-18 14:21:17.344939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffe7e6e7 cdw11:e7ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.355 [2024-11-18 14:21:17.344952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.355 [2024-11-18 14:21:17.345007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000200 cdw11:ffff0a68 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.355 [2024-11-18 14:21:17.345020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.355 #81 NEW cov: 12467 ft: 15696 corp: 36/1054b lim: 40 exec/s: 40 rss: 75Mb L: 33/38 MS: 1 ShuffleBytes- 00:08:21.355 #81 DONE cov: 12467 ft: 15696 corp: 36/1054b lim: 40 exec/s: 40 rss: 75Mb 00:08:21.355 ###### Recommended dictionary. ###### 00:08:21.355 "\005\000" # Uses: 3 00:08:21.355 "\000\000\002\000" # Uses: 3 00:08:21.355 ###### End of recommended dictionary. 
###### 00:08:21.355 Done 81 runs in 2 second(s) 00:08:21.355 14:21:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_11.conf /var/tmp/suppress_nvmf_fuzz 00:08:21.616 14:21:17 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:21.617 14:21:17 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:21.617 14:21:17 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:08:21.617 14:21:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:08:21.617 14:21:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:21.617 14:21:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:21.617 14:21:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:21.617 14:21:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:08:21.617 14:21:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:21.617 14:21:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:21.617 14:21:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 12 00:08:21.617 14:21:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4412 00:08:21.617 14:21:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:21.617 14:21:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:08:21.617 14:21:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:21.617 14:21:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:21.617 14:21:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:21.617 14:21:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 00:08:21.617 [2024-11-18 14:21:17.530026] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 
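The "Recommended dictionary" block printed at the end of the run is libFuzzer's persistent-auto-dictionary summary: byte sequences the mutator kept reusing successfully (the PersAutoDict entries in the MS: fields above), here the two-byte "\005\000" and four-byte "\000\000\002\000" values that land in the cdw10/cdw11 words of the fuzzed commands. When driving libFuzzer by hand, such sequences can be carried into a later run as a standard dictionary file passed with -dict=<file>; a hypothetical example in libFuzzer's name="escaped bytes" syntax (the run.sh wrapper above does not generate or pass one):

    # nvmf_11.dict -- hypothetical libFuzzer dictionary seeded from the
    # "Recommended dictionary" entries printed by the run above.
    cdw_pair_1="\x05\x00"
    cdw_pair_2="\x00\x00\x02\x00"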
00:08:21.617 [2024-11-18 14:21:17.530091] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid327756 ] 00:08:21.617 [2024-11-18 14:21:17.730606] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:21.617 [2024-11-18 14:21:17.743563] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:21.877 [2024-11-18 14:21:17.796446] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:21.877 [2024-11-18 14:21:17.812781] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:08:21.877 INFO: Running with entropic power schedule (0xFF, 100). 00:08:21.877 INFO: Seed: 3298881514 00:08:21.877 INFO: Loaded 1 modules (387495 inline 8-bit counters): 387495 [0x2acbe4c, 0x2b2a7f3), 00:08:21.877 INFO: Loaded 1 PC tables (387495 PCs): 387495 [0x2b2a7f8,0x3114268), 00:08:21.877 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:21.877 INFO: A corpus is not provided, starting from an empty corpus 00:08:21.877 #2 INITED exec/s: 0 rss: 65Mb 00:08:21.878 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:21.878 This may also happen if the target rejected all inputs we tried so far 00:08:21.878 [2024-11-18 14:21:17.878255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.878 [2024-11-18 14:21:17.878285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.138 NEW_FUNC[1/716]: 0x469fe8 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:08:22.138 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:22.138 #18 NEW cov: 12220 ft: 12209 corp: 2/10b lim: 40 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:08:22.138 [2024-11-18 14:21:18.209285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.138 [2024-11-18 14:21:18.209342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.138 #19 NEW cov: 12350 ft: 12909 corp: 3/20b lim: 40 exec/s: 0 rss: 72Mb L: 10/10 MS: 1 CopyPart- 00:08:22.399 [2024-11-18 14:21:18.279445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffdddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.399 [2024-11-18 14:21:18.279472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.399 [2024-11-18 14:21:18.279528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:dddddddd cdw11:dddddddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.399 [2024-11-18 14:21:18.279543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.399 [2024-11-18 14:21:18.279605] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:dddddddd cdw11:dddddddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.399 [2024-11-18 14:21:18.279618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.399 #20 NEW cov: 12356 ft: 13826 corp: 4/49b lim: 40 exec/s: 0 rss: 72Mb L: 29/29 MS: 1 InsertRepeatedBytes- 00:08:22.399 [2024-11-18 14:21:18.339585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ff24ffff cdw11:ffffdddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.399 [2024-11-18 14:21:18.339610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.399 [2024-11-18 14:21:18.339668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:dddddddd cdw11:dddddddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.399 [2024-11-18 14:21:18.339681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.399 [2024-11-18 14:21:18.339739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:dddddddd cdw11:dddddddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.399 [2024-11-18 14:21:18.339753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.399 #21 NEW cov: 12441 ft: 14113 corp: 5/78b lim: 40 exec/s: 0 rss: 72Mb L: 29/29 MS: 1 ChangeByte- 00:08:22.399 [2024-11-18 14:21:18.399794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ff24ffff cdw11:ffffdddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.399 [2024-11-18 14:21:18.399820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.399 [2024-11-18 14:21:18.399878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:dddddddd cdw11:dddddddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.399 [2024-11-18 14:21:18.399892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.399 [2024-11-18 14:21:18.399949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:dddddddd cdw11:dddddddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.399 [2024-11-18 14:21:18.399962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.399 #22 NEW cov: 12441 ft: 14160 corp: 6/107b lim: 40 exec/s: 0 rss: 73Mb L: 29/29 MS: 1 ShuffleBytes- 00:08:22.399 [2024-11-18 14:21:18.459641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e7e6060 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.399 [2024-11-18 14:21:18.459666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.399 #26 NEW cov: 12441 ft: 14330 corp: 7/118b lim: 40 exec/s: 0 rss: 73Mb L: 11/29 MS: 4 ChangeByte-InsertByte-CopyPart-CrossOver- 00:08:22.399 [2024-11-18 14:21:18.500044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 
cdw10:ffffffff cdw11:ffffdddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.399 [2024-11-18 14:21:18.500070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.399 [2024-11-18 14:21:18.500126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:dddddddd cdw11:dddddddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.399 [2024-11-18 14:21:18.500140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.399 [2024-11-18 14:21:18.500197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:dddddddd cdw11:dddddddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.399 [2024-11-18 14:21:18.500211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.399 #27 NEW cov: 12441 ft: 14426 corp: 8/147b lim: 40 exec/s: 0 rss: 73Mb L: 29/29 MS: 1 ChangeBinInt- 00:08:22.660 [2024-11-18 14:21:18.540130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ff24ffff cdw11:ffffdddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.660 [2024-11-18 14:21:18.540156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.660 [2024-11-18 14:21:18.540216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:dd29dddd cdw11:dddddddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.660 [2024-11-18 14:21:18.540230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.660 [2024-11-18 14:21:18.540286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:dddddddd cdw11:dddddddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.660 [2024-11-18 14:21:18.540303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.660 #28 NEW cov: 12441 ft: 14442 corp: 9/176b lim: 40 exec/s: 0 rss: 73Mb L: 29/29 MS: 1 ChangeByte- 00:08:22.660 [2024-11-18 14:21:18.600489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.660 [2024-11-18 14:21:18.600513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.660 [2024-11-18 14:21:18.600590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.660 [2024-11-18 14:21:18.600604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.660 [2024-11-18 14:21:18.600663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.660 [2024-11-18 14:21:18.600677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.660 [2024-11-18 14:21:18.600732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) 
qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.660 [2024-11-18 14:21:18.600746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.660 #29 NEW cov: 12441 ft: 14787 corp: 10/210b lim: 40 exec/s: 0 rss: 73Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:08:22.660 [2024-11-18 14:21:18.640389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ff24ffff cdw11:ffffdddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.660 [2024-11-18 14:21:18.640414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.660 [2024-11-18 14:21:18.640470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:dddddddd cdw11:dddddddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.660 [2024-11-18 14:21:18.640484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.660 [2024-11-18 14:21:18.640541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:dddddddd cdw11:dddddddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.660 [2024-11-18 14:21:18.640558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.660 #30 NEW cov: 12441 ft: 14824 corp: 11/240b lim: 40 exec/s: 0 rss: 73Mb L: 30/34 MS: 1 InsertByte- 00:08:22.660 [2024-11-18 14:21:18.680378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e7e6060 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.660 [2024-11-18 14:21:18.680403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.660 [2024-11-18 14:21:18.680477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e2e2e2e2 cdw11:e2e2e2e2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.660 [2024-11-18 14:21:18.680492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.660 #31 NEW cov: 12441 ft: 15081 corp: 12/262b lim: 40 exec/s: 0 rss: 73Mb L: 22/34 MS: 1 InsertRepeatedBytes- 00:08:22.660 [2024-11-18 14:21:18.740546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e7e6060 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.660 [2024-11-18 14:21:18.740575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.660 [2024-11-18 14:21:18.740647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e2e20100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.660 [2024-11-18 14:21:18.740664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.661 NEW_FUNC[1/1]: 0x1c52b68 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:22.661 #32 NEW cov: 12464 ft: 15207 corp: 13/284b lim: 40 exec/s: 0 rss: 73Mb L: 22/34 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:08:22.922 [2024-11-18 14:21:18.800743] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffdddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.922 [2024-11-18 14:21:18.800769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.922 [2024-11-18 14:21:18.800827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:dddddddd cdw11:dddddddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.922 [2024-11-18 14:21:18.800841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.922 #33 NEW cov: 12464 ft: 15265 corp: 14/303b lim: 40 exec/s: 0 rss: 73Mb L: 19/34 MS: 1 CrossOver- 00:08:22.922 [2024-11-18 14:21:18.840851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffdddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.922 [2024-11-18 14:21:18.840876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.922 [2024-11-18 14:21:18.840933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:01000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.922 [2024-11-18 14:21:18.840946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.922 #34 NEW cov: 12464 ft: 15336 corp: 15/322b lim: 40 exec/s: 34 rss: 73Mb L: 19/34 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:08:22.922 [2024-11-18 14:21:18.901331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ff24ffff cdw11:ffffdddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.922 [2024-11-18 14:21:18.901357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.922 [2024-11-18 14:21:18.901430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:dddddddd cdw11:dddddddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.922 [2024-11-18 14:21:18.901444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.922 [2024-11-18 14:21:18.901500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:dddddddd cdw11:dddddddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.922 [2024-11-18 14:21:18.901513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.922 [2024-11-18 14:21:18.901569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ddffffff cdw11:0a60dddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.922 [2024-11-18 14:21:18.901583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.922 #35 NEW cov: 12464 ft: 15338 corp: 16/354b lim: 40 exec/s: 35 rss: 73Mb L: 32/34 MS: 1 CopyPart- 00:08:22.922 [2024-11-18 14:21:18.961494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.922 [2024-11-18 14:21:18.961520] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.922 [2024-11-18 14:21:18.961581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.922 [2024-11-18 14:21:18.961598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.922 [2024-11-18 14:21:18.961656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.922 [2024-11-18 14:21:18.961669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.922 [2024-11-18 14:21:18.961723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.922 [2024-11-18 14:21:18.961736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.922 #36 NEW cov: 12464 ft: 15355 corp: 17/392b lim: 40 exec/s: 36 rss: 73Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:22.922 [2024-11-18 14:21:19.021370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ff24ffff cdw11:ffffdddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.922 [2024-11-18 14:21:19.021395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.922 [2024-11-18 14:21:19.021469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:dddddddd cdw11:dddddddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.922 [2024-11-18 14:21:19.021483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.922 #37 NEW cov: 12464 ft: 15366 corp: 18/412b lim: 40 exec/s: 37 rss: 73Mb L: 20/38 MS: 1 EraseBytes- 00:08:23.195 [2024-11-18 14:21:19.061446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e7e6060 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.195 [2024-11-18 14:21:19.061472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.195 [2024-11-18 14:21:19.061546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e2e20100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.195 [2024-11-18 14:21:19.061567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.195 #38 NEW cov: 12464 ft: 15373 corp: 19/434b lim: 40 exec/s: 38 rss: 73Mb L: 22/38 MS: 1 CopyPart- 00:08:23.195 [2024-11-18 14:21:19.121809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffdddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.195 [2024-11-18 14:21:19.121835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.195 [2024-11-18 14:21:19.121894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE 
SEND (19) qid:0 cid:5 nsid:0 cdw10:dddddddd cdw11:dddddddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.195 [2024-11-18 14:21:19.121908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.195 [2024-11-18 14:21:19.121964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:dddddddd cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.195 [2024-11-18 14:21:19.121978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.195 #39 NEW cov: 12464 ft: 15380 corp: 20/458b lim: 40 exec/s: 39 rss: 74Mb L: 24/38 MS: 1 EraseBytes- 00:08:23.195 [2024-11-18 14:21:19.161908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e7e6060 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.195 [2024-11-18 14:21:19.161933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.195 [2024-11-18 14:21:19.162015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e2e20100 cdw11:0000007e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.195 [2024-11-18 14:21:19.162030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.195 [2024-11-18 14:21:19.162088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:7e6060ff cdw11:000000e2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.195 [2024-11-18 14:21:19.162101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.195 #40 NEW cov: 12464 ft: 15411 corp: 21/485b lim: 40 exec/s: 40 rss: 74Mb L: 27/38 MS: 1 CrossOver- 00:08:23.195 [2024-11-18 14:21:19.201989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e7e6060 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.195 [2024-11-18 14:21:19.202015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.195 [2024-11-18 14:21:19.202074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e2c50100 cdw11:0000007e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.195 [2024-11-18 14:21:19.202088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.195 [2024-11-18 14:21:19.202144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:7e6060ff cdw11:000000e2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.195 [2024-11-18 14:21:19.202158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.195 #41 NEW cov: 12464 ft: 15428 corp: 22/512b lim: 40 exec/s: 41 rss: 74Mb L: 27/38 MS: 1 ChangeByte- 00:08:23.195 [2024-11-18 14:21:19.262235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffdddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.195 [2024-11-18 14:21:19.262261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:23.195 [2024-11-18 14:21:19.262336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:dfdddddd cdw11:dddddddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.195 [2024-11-18 14:21:19.262351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.195 [2024-11-18 14:21:19.262408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:dddddddd cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.195 [2024-11-18 14:21:19.262421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.195 #42 NEW cov: 12464 ft: 15439 corp: 23/536b lim: 40 exec/s: 42 rss: 74Mb L: 24/38 MS: 1 ChangeBinInt- 00:08:23.458 [2024-11-18 14:21:19.322539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.458 [2024-11-18 14:21:19.322571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.458 [2024-11-18 14:21:19.322644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.458 [2024-11-18 14:21:19.322658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.458 [2024-11-18 14:21:19.322715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:fff7ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.458 [2024-11-18 14:21:19.322740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.458 [2024-11-18 14:21:19.322802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.458 [2024-11-18 14:21:19.322815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.458 #43 NEW cov: 12464 ft: 15495 corp: 24/570b lim: 40 exec/s: 43 rss: 74Mb L: 34/38 MS: 1 ChangeBit- 00:08:23.458 [2024-11-18 14:21:19.362592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.458 [2024-11-18 14:21:19.362618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.458 [2024-11-18 14:21:19.362676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff25ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.458 [2024-11-18 14:21:19.362690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.458 [2024-11-18 14:21:19.362746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.458 [2024-11-18 14:21:19.362760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.458 [2024-11-18 14:21:19.362814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.458 [2024-11-18 14:21:19.362827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.458 #44 NEW cov: 12464 ft: 15509 corp: 25/604b lim: 40 exec/s: 44 rss: 74Mb L: 34/38 MS: 1 ChangeByte- 00:08:23.458 [2024-11-18 14:21:19.402702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e7e6060 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.458 [2024-11-18 14:21:19.402727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.458 [2024-11-18 14:21:19.402803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e2ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.458 [2024-11-18 14:21:19.402818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.458 [2024-11-18 14:21:19.402874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0ae20100 cdw11:0000007e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.458 [2024-11-18 14:21:19.402888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.458 [2024-11-18 14:21:19.402946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:7e6060ff cdw11:000000e2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.458 [2024-11-18 14:21:19.402960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.458 #45 NEW cov: 12464 ft: 15524 corp: 26/639b lim: 40 exec/s: 45 rss: 74Mb L: 35/38 MS: 1 CrossOver- 00:08:23.458 [2024-11-18 14:21:19.442495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e7e6060 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.458 [2024-11-18 14:21:19.442520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.458 [2024-11-18 14:21:19.442599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e2e20100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.458 [2024-11-18 14:21:19.442613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.458 #46 NEW cov: 12464 ft: 15534 corp: 27/661b lim: 40 exec/s: 46 rss: 74Mb L: 22/38 MS: 1 ShuffleBytes- 00:08:23.458 [2024-11-18 14:21:19.502844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffdddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.458 [2024-11-18 14:21:19.502870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.458 [2024-11-18 14:21:19.502928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:dfdd3bdd cdw11:dddddddd SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:23.458 [2024-11-18 14:21:19.502942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.458 [2024-11-18 14:21:19.503000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:dddddddd cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.458 [2024-11-18 14:21:19.503014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.458 #47 NEW cov: 12464 ft: 15539 corp: 28/685b lim: 40 exec/s: 47 rss: 74Mb L: 24/38 MS: 1 ChangeByte- 00:08:23.458 [2024-11-18 14:21:19.562703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e7e6060 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.458 [2024-11-18 14:21:19.562730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.720 #48 NEW cov: 12464 ft: 15594 corp: 29/700b lim: 40 exec/s: 48 rss: 74Mb L: 15/38 MS: 1 CopyPart- 00:08:23.720 [2024-11-18 14:21:19.602948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ff24ffff cdw11:ffffdddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.720 [2024-11-18 14:21:19.602974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.720 [2024-11-18 14:21:19.603031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:dddddddd cdw11:dddddddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.720 [2024-11-18 14:21:19.603045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.720 #49 NEW cov: 12464 ft: 15604 corp: 30/719b lim: 40 exec/s: 49 rss: 74Mb L: 19/38 MS: 1 EraseBytes- 00:08:23.720 [2024-11-18 14:21:19.663115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ff24ffff cdw11:ffffdddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.720 [2024-11-18 14:21:19.663139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.720 [2024-11-18 14:21:19.663213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:dddddddd cdw11:ddddddff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.720 [2024-11-18 14:21:19.663227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.720 #50 NEW cov: 12464 ft: 15623 corp: 31/736b lim: 40 exec/s: 50 rss: 74Mb L: 17/38 MS: 1 EraseBytes- 00:08:23.720 [2024-11-18 14:21:19.703253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffdddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.720 [2024-11-18 14:21:19.703278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.720 [2024-11-18 14:21:19.703337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:dddddddd cdw11:dddddddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.720 [2024-11-18 14:21:19.703351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.720 [2024-11-18 14:21:19.743421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffdddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.720 [2024-11-18 14:21:19.743449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.720 [2024-11-18 14:21:19.743523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:dddddddd cdw11:dddddddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.720 [2024-11-18 14:21:19.743537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.720 #52 NEW cov: 12464 ft: 15651 corp: 32/756b lim: 40 exec/s: 52 rss: 74Mb L: 20/38 MS: 2 InsertByte-ChangeByte- 00:08:23.720 [2024-11-18 14:21:19.783635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ff24ffff cdw11:ffff65dd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.720 [2024-11-18 14:21:19.783660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.720 [2024-11-18 14:21:19.783737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:dddddddd cdw11:dddddddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.720 [2024-11-18 14:21:19.783751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.720 [2024-11-18 14:21:19.783809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:dddddddd cdw11:dddddddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.720 [2024-11-18 14:21:19.783822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.720 #53 NEW cov: 12464 ft: 15658 corp: 33/786b lim: 40 exec/s: 53 rss: 74Mb L: 30/38 MS: 1 InsertByte- 00:08:23.720 [2024-11-18 14:21:19.823445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffdddddd cdw11:dddddddd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.720 [2024-11-18 14:21:19.823470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.720 #54 NEW cov: 12464 ft: 15662 corp: 34/801b lim: 40 exec/s: 54 rss: 74Mb L: 15/38 MS: 1 EraseBytes- 00:08:23.982 [2024-11-18 14:21:19.864065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.982 [2024-11-18 14:21:19.864091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.982 [2024-11-18 14:21:19.864151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff25ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.982 [2024-11-18 14:21:19.864165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.982 [2024-11-18 14:21:19.864237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:23.982 [2024-11-18 14:21:19.864251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.982 [2024-11-18 14:21:19.864307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.982 [2024-11-18 14:21:19.864320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.982 #55 NEW cov: 12464 ft: 15665 corp: 35/835b lim: 40 exec/s: 27 rss: 74Mb L: 34/38 MS: 1 ShuffleBytes- 00:08:23.982 #55 DONE cov: 12464 ft: 15665 corp: 35/835b lim: 40 exec/s: 27 rss: 74Mb 00:08:23.982 ###### Recommended dictionary. ###### 00:08:23.982 "\001\000\000\000\000\000\000\000" # Uses: 1 00:08:23.982 ###### End of recommended dictionary. ###### 00:08:23.982 Done 55 runs in 2 second(s) 00:08:23.982 14:21:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_12.conf /var/tmp/suppress_nvmf_fuzz 00:08:23.982 14:21:20 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:23.982 14:21:20 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:23.982 14:21:20 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:08:23.982 14:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:08:23.982 14:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:23.982 14:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:23.982 14:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:23.982 14:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:08:23.982 14:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:23.982 14:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:23.982 14:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 13 00:08:23.982 14:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4413 00:08:23.982 14:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:23.982 14:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:08:23.982 14:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:23.982 14:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:23.982 14:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:23.982 14:21:20 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 00:08:23.982 [2024-11-18 
14:21:20.052134] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:08:23.982 [2024-11-18 14:21:20.052225] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid328291 ] 00:08:24.243 [2024-11-18 14:21:20.324896] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:24.243 [2024-11-18 14:21:20.347365] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:24.503 [2024-11-18 14:21:20.400190] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:24.503 [2024-11-18 14:21:20.416518] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:08:24.503 INFO: Running with entropic power schedule (0xFF, 100). 00:08:24.503 INFO: Seed: 1605889096 00:08:24.504 INFO: Loaded 1 modules (387495 inline 8-bit counters): 387495 [0x2acbe4c, 0x2b2a7f3), 00:08:24.504 INFO: Loaded 1 PC tables (387495 PCs): 387495 [0x2b2a7f8,0x3114268), 00:08:24.504 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:24.504 INFO: A corpus is not provided, starting from an empty corpus 00:08:24.504 #2 INITED exec/s: 0 rss: 65Mb 00:08:24.504 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:24.504 This may also happen if the target rejected all inputs we tried so far 00:08:24.504 [2024-11-18 14:21:20.475737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.504 [2024-11-18 14:21:20.475765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.504 [2024-11-18 14:21:20.475826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.504 [2024-11-18 14:21:20.475844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.504 [2024-11-18 14:21:20.475903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.504 [2024-11-18 14:21:20.475917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.504 [2024-11-18 14:21:20.475977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.504 [2024-11-18 14:21:20.475991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.504 [2024-11-18 14:21:20.476048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.504 [2024-11-18 14:21:20.476062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:24.764 NEW_FUNC[1/715]: 0x46bbb8 in fuzz_admin_directive_receive_command 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:08:24.764 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:24.764 #3 NEW cov: 12226 ft: 12227 corp: 2/41b lim: 40 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:24.764 [2024-11-18 14:21:20.816608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.764 [2024-11-18 14:21:20.816663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.764 [2024-11-18 14:21:20.816752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.764 [2024-11-18 14:21:20.816780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.764 [2024-11-18 14:21:20.816864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff1a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.764 [2024-11-18 14:21:20.816890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.764 #7 NEW cov: 12339 ft: 13422 corp: 3/67b lim: 40 exec/s: 0 rss: 72Mb L: 26/40 MS: 4 CopyPart-ChangeBit-CopyPart-InsertRepeatedBytes- 00:08:24.764 [2024-11-18 14:21:20.866688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.764 [2024-11-18 14:21:20.866715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.764 [2024-11-18 14:21:20.866775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.765 [2024-11-18 14:21:20.866789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.765 [2024-11-18 14:21:20.866848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.765 [2024-11-18 14:21:20.866861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.765 [2024-11-18 14:21:20.866919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.765 [2024-11-18 14:21:20.866936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.765 [2024-11-18 14:21:20.866994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:3bffffff cdw11:ffffff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.765 [2024-11-18 14:21:20.867007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:25.025 #8 
NEW cov: 12345 ft: 13617 corp: 4/107b lim: 40 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 ChangeByte- 00:08:25.025 [2024-11-18 14:21:20.926598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.025 [2024-11-18 14:21:20.926625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.025 [2024-11-18 14:21:20.926685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.025 [2024-11-18 14:21:20.926699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.025 [2024-11-18 14:21:20.926757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff1a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.025 [2024-11-18 14:21:20.926771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.025 #9 NEW cov: 12430 ft: 13986 corp: 5/133b lim: 40 exec/s: 0 rss: 72Mb L: 26/40 MS: 1 ShuffleBytes- 00:08:25.025 [2024-11-18 14:21:20.986766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.025 [2024-11-18 14:21:20.986792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.025 [2024-11-18 14:21:20.986852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff010000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.025 [2024-11-18 14:21:20.986866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.025 [2024-11-18 14:21:20.986925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00ffffff cdw11:ffff1a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.025 [2024-11-18 14:21:20.986939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.025 #10 NEW cov: 12430 ft: 14050 corp: 6/159b lim: 40 exec/s: 0 rss: 72Mb L: 26/40 MS: 1 ChangeBinInt- 00:08:25.025 [2024-11-18 14:21:21.046933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.025 [2024-11-18 14:21:21.046959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.025 [2024-11-18 14:21:21.047019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff010000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.025 [2024-11-18 14:21:21.047033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.025 [2024-11-18 14:21:21.047091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:001a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:25.025 [2024-11-18 14:21:21.047105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.025 #16 NEW cov: 12430 ft: 14167 corp: 7/185b lim: 40 exec/s: 0 rss: 73Mb L: 26/40 MS: 1 ChangeBinInt- 00:08:25.025 [2024-11-18 14:21:21.107364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff28 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.025 [2024-11-18 14:21:21.107389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.025 [2024-11-18 14:21:21.107466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.025 [2024-11-18 14:21:21.107480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.025 [2024-11-18 14:21:21.107540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.026 [2024-11-18 14:21:21.107558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.026 [2024-11-18 14:21:21.107617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.026 [2024-11-18 14:21:21.107631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.026 [2024-11-18 14:21:21.107687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.026 [2024-11-18 14:21:21.107700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:25.026 #17 NEW cov: 12430 ft: 14289 corp: 8/225b lim: 40 exec/s: 0 rss: 73Mb L: 40/40 MS: 1 ChangeBinInt- 00:08:25.026 [2024-11-18 14:21:21.147245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.026 [2024-11-18 14:21:21.147270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.026 [2024-11-18 14:21:21.147334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.026 [2024-11-18 14:21:21.147348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.026 [2024-11-18 14:21:21.147411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffe4f5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.026 [2024-11-18 14:21:21.147424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.287 #18 NEW cov: 12430 ft: 14323 corp: 9/251b lim: 40 exec/s: 0 rss: 73Mb L: 26/40 MS: 1 ChangeBinInt- 00:08:25.287 [2024-11-18 
14:21:21.187558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.287 [2024-11-18 14:21:21.187584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.287 [2024-11-18 14:21:21.187644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.287 [2024-11-18 14:21:21.187658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.287 [2024-11-18 14:21:21.187716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.287 [2024-11-18 14:21:21.187729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.287 [2024-11-18 14:21:21.187791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.287 [2024-11-18 14:21:21.187804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.287 [2024-11-18 14:21:21.187864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:fdffff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.287 [2024-11-18 14:21:21.187877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:25.287 #19 NEW cov: 12430 ft: 14409 corp: 10/291b lim: 40 exec/s: 0 rss: 73Mb L: 40/40 MS: 1 ChangeBit- 00:08:25.287 [2024-11-18 14:21:21.227434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.287 [2024-11-18 14:21:21.227459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.287 [2024-11-18 14:21:21.227534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffdf SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.287 [2024-11-18 14:21:21.227554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.287 [2024-11-18 14:21:21.227613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff1a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.287 [2024-11-18 14:21:21.227627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.287 #20 NEW cov: 12430 ft: 14461 corp: 11/317b lim: 40 exec/s: 0 rss: 73Mb L: 26/40 MS: 1 ChangeBit- 00:08:25.287 [2024-11-18 14:21:21.267786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.287 [2024-11-18 14:21:21.267811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.287 [2024-11-18 14:21:21.267888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.287 [2024-11-18 14:21:21.267902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.287 [2024-11-18 14:21:21.267961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.287 [2024-11-18 14:21:21.267974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.287 [2024-11-18 14:21:21.268032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffff3fff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.287 [2024-11-18 14:21:21.268046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.287 [2024-11-18 14:21:21.268104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.287 [2024-11-18 14:21:21.268118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:25.287 #21 NEW cov: 12430 ft: 14498 corp: 12/357b lim: 40 exec/s: 0 rss: 73Mb L: 40/40 MS: 1 ChangeByte- 00:08:25.287 [2024-11-18 14:21:21.307880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.287 [2024-11-18 14:21:21.307904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.287 [2024-11-18 14:21:21.307969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:00f8ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.287 [2024-11-18 14:21:21.307983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.287 [2024-11-18 14:21:21.308046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.287 [2024-11-18 14:21:21.308059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.287 [2024-11-18 14:21:21.308118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.287 [2024-11-18 14:21:21.308131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.287 [2024-11-18 14:21:21.308189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:3bffffff cdw11:ffffff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.287 [2024-11-18 14:21:21.308202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:25.287 
#22 NEW cov: 12430 ft: 14536 corp: 13/397b lim: 40 exec/s: 0 rss: 73Mb L: 40/40 MS: 1 ChangeBinInt- 00:08:25.287 [2024-11-18 14:21:21.368118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff28 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.287 [2024-11-18 14:21:21.368143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.287 [2024-11-18 14:21:21.368203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.287 [2024-11-18 14:21:21.368217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.287 [2024-11-18 14:21:21.368290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.287 [2024-11-18 14:21:21.368304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.287 [2024-11-18 14:21:21.368363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.287 [2024-11-18 14:21:21.368376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.288 [2024-11-18 14:21:21.368435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ff3bff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.288 [2024-11-18 14:21:21.368449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:25.288 NEW_FUNC[1/1]: 0x1c52b68 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:25.288 #23 NEW cov: 12453 ft: 14573 corp: 14/437b lim: 40 exec/s: 0 rss: 73Mb L: 40/40 MS: 1 ChangeByte- 00:08:25.549 [2024-11-18 14:21:21.428123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffff13 cdw11:13131313 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.549 [2024-11-18 14:21:21.428148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.549 [2024-11-18 14:21:21.428224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:13131313 cdw11:131313ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.549 [2024-11-18 14:21:21.428242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.549 [2024-11-18 14:21:21.428303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.549 [2024-11-18 14:21:21.428317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.549 [2024-11-18 14:21:21.428375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ff010000 cdw11:00ffffff SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:25.549 [2024-11-18 14:21:21.428389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.549 #24 NEW cov: 12453 ft: 14600 corp: 15/475b lim: 40 exec/s: 0 rss: 73Mb L: 38/40 MS: 1 InsertRepeatedBytes- 00:08:25.549 [2024-11-18 14:21:21.468089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.549 [2024-11-18 14:21:21.468114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.549 [2024-11-18 14:21:21.468175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.549 [2024-11-18 14:21:21.468189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.549 [2024-11-18 14:21:21.468247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.549 [2024-11-18 14:21:21.468261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.549 #25 NEW cov: 12453 ft: 14633 corp: 16/504b lim: 40 exec/s: 25 rss: 73Mb L: 29/40 MS: 1 EraseBytes- 00:08:25.549 [2024-11-18 14:21:21.508207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff2eff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.549 [2024-11-18 14:21:21.508231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.549 [2024-11-18 14:21:21.508311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.549 [2024-11-18 14:21:21.508326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.549 [2024-11-18 14:21:21.508385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:dfffffff cdw11:ffffff1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.549 [2024-11-18 14:21:21.508399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.549 #26 NEW cov: 12453 ft: 14659 corp: 17/531b lim: 40 exec/s: 26 rss: 73Mb L: 27/40 MS: 1 InsertByte- 00:08:25.549 [2024-11-18 14:21:21.568379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffff9c cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.549 [2024-11-18 14:21:21.568403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.549 [2024-11-18 14:21:21.568481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.549 [2024-11-18 14:21:21.568495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.549 
[2024-11-18 14:21:21.568559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffe4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.549 [2024-11-18 14:21:21.568577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.549 #27 NEW cov: 12453 ft: 14690 corp: 18/558b lim: 40 exec/s: 27 rss: 73Mb L: 27/40 MS: 1 InsertByte- 00:08:25.549 [2024-11-18 14:21:21.628417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.549 [2024-11-18 14:21:21.628441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.549 [2024-11-18 14:21:21.628519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.549 [2024-11-18 14:21:21.628533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.549 #28 NEW cov: 12453 ft: 14940 corp: 19/580b lim: 40 exec/s: 28 rss: 73Mb L: 22/40 MS: 1 EraseBytes- 00:08:25.549 [2024-11-18 14:21:21.668908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff28 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.549 [2024-11-18 14:21:21.668933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.549 [2024-11-18 14:21:21.669011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.549 [2024-11-18 14:21:21.669025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.549 [2024-11-18 14:21:21.669086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.549 [2024-11-18 14:21:21.669100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.549 [2024-11-18 14:21:21.669159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.549 [2024-11-18 14:21:21.669173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.549 [2024-11-18 14:21:21.669232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ff3bff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.549 [2024-11-18 14:21:21.669245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:25.810 #29 NEW cov: 12453 ft: 14964 corp: 20/620b lim: 40 exec/s: 29 rss: 73Mb L: 40/40 MS: 1 ShuffleBytes- 00:08:25.810 [2024-11-18 14:21:21.728858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:25.810 [2024-11-18 14:21:21.728883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.810 [2024-11-18 14:21:21.728960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.810 [2024-11-18 14:21:21.728974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.810 [2024-11-18 14:21:21.729033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:fdffff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.810 [2024-11-18 14:21:21.729046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.810 #30 NEW cov: 12453 ft: 15032 corp: 21/644b lim: 40 exec/s: 30 rss: 73Mb L: 24/40 MS: 1 EraseBytes- 00:08:25.810 [2024-11-18 14:21:21.789161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff8515a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.810 [2024-11-18 14:21:21.789185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.810 [2024-11-18 14:21:21.789262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:5eb38b8b cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.810 [2024-11-18 14:21:21.789277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.810 [2024-11-18 14:21:21.789338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.810 [2024-11-18 14:21:21.789352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.811 [2024-11-18 14:21:21.789413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff1a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.811 [2024-11-18 14:21:21.789426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.811 #31 NEW cov: 12453 ft: 15034 corp: 22/678b lim: 40 exec/s: 31 rss: 73Mb L: 34/40 MS: 1 CMP- DE: "\205\025\246^\263\213\213\000"- 00:08:25.811 [2024-11-18 14:21:21.829109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.811 [2024-11-18 14:21:21.829134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.811 [2024-11-18 14:21:21.829195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff014000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.811 [2024-11-18 14:21:21.829208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.811 [2024-11-18 14:21:21.829269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE 
(1a) qid:0 cid:6 nsid:0 cdw10:001a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.811 [2024-11-18 14:21:21.829282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.811 #32 NEW cov: 12453 ft: 15076 corp: 23/704b lim: 40 exec/s: 32 rss: 73Mb L: 26/40 MS: 1 ChangeBit- 00:08:25.811 [2024-11-18 14:21:21.889303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.811 [2024-11-18 14:21:21.889329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.811 [2024-11-18 14:21:21.889408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffdf SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.811 [2024-11-18 14:21:21.889423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.811 [2024-11-18 14:21:21.889486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffefffff cdw11:ffff1a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.811 [2024-11-18 14:21:21.889500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.811 #33 NEW cov: 12453 ft: 15085 corp: 24/730b lim: 40 exec/s: 33 rss: 74Mb L: 26/40 MS: 1 ChangeBit- 00:08:25.811 [2024-11-18 14:21:21.929655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.811 [2024-11-18 14:21:21.929683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.811 [2024-11-18 14:21:21.929761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ff848484 cdw11:84848484 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.811 [2024-11-18 14:21:21.929775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.811 [2024-11-18 14:21:21.929836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:84848484 cdw11:848484ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.811 [2024-11-18 14:21:21.929850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.811 [2024-11-18 14:21:21.929910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffdfffef SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.811 [2024-11-18 14:21:21.929923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.811 [2024-11-18 14:21:21.929982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:1a0a1a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.811 [2024-11-18 14:21:21.929995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:26.071 #34 NEW cov: 12453 ft: 15123 corp: 25/770b 
lim: 40 exec/s: 34 rss: 74Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:26.072 [2024-11-18 14:21:21.989532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.072 [2024-11-18 14:21:21.989561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.072 [2024-11-18 14:21:21.989640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffdfffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.072 [2024-11-18 14:21:21.989654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.072 [2024-11-18 14:21:21.989718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:1a0a1a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.072 [2024-11-18 14:21:21.989732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.072 #35 NEW cov: 12453 ft: 15144 corp: 26/794b lim: 40 exec/s: 35 rss: 74Mb L: 24/40 MS: 1 EraseBytes- 00:08:26.072 [2024-11-18 14:21:22.029946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.072 [2024-11-18 14:21:22.029972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.072 [2024-11-18 14:21:22.030033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.072 [2024-11-18 14:21:22.030047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.072 [2024-11-18 14:21:22.030108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.072 [2024-11-18 14:21:22.030121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.072 [2024-11-18 14:21:22.030182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.072 [2024-11-18 14:21:22.030199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.072 [2024-11-18 14:21:22.030260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:3bffffff cdw11:ffffff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.072 [2024-11-18 14:21:22.030273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:26.072 #36 NEW cov: 12453 ft: 15150 corp: 27/834b lim: 40 exec/s: 36 rss: 74Mb L: 40/40 MS: 1 ShuffleBytes- 00:08:26.072 [2024-11-18 14:21:22.069769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:8515a65e cdw11:b38b8b00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.072 [2024-11-18 
14:21:22.069794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.072 [2024-11-18 14:21:22.069854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff010000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.072 [2024-11-18 14:21:22.069868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.072 [2024-11-18 14:21:22.069928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:001a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.072 [2024-11-18 14:21:22.069941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.072 #37 NEW cov: 12453 ft: 15176 corp: 28/860b lim: 40 exec/s: 37 rss: 74Mb L: 26/40 MS: 1 PersAutoDict- DE: "\205\025\246^\263\213\213\000"- 00:08:26.072 [2024-11-18 14:21:22.109873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff2eff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.072 [2024-11-18 14:21:22.109897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.072 [2024-11-18 14:21:22.109973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.072 [2024-11-18 14:21:22.109987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.072 [2024-11-18 14:21:22.110048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff1a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.072 [2024-11-18 14:21:22.110062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.072 #38 NEW cov: 12453 ft: 15204 corp: 29/886b lim: 40 exec/s: 38 rss: 74Mb L: 26/40 MS: 1 EraseBytes- 00:08:26.072 [2024-11-18 14:21:22.170188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.072 [2024-11-18 14:21:22.170212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.072 [2024-11-18 14:21:22.170290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffdf SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.072 [2024-11-18 14:21:22.170305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.072 [2024-11-18 14:21:22.170365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff1a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.072 [2024-11-18 14:21:22.170379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.072 [2024-11-18 14:21:22.170438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 
nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.072 [2024-11-18 14:21:22.170457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.072 #39 NEW cov: 12453 ft: 15218 corp: 30/920b lim: 40 exec/s: 39 rss: 74Mb L: 34/40 MS: 1 InsertRepeatedBytes- 00:08:26.333 [2024-11-18 14:21:22.209912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.333 [2024-11-18 14:21:22.209937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.333 #40 NEW cov: 12453 ft: 15529 corp: 31/933b lim: 40 exec/s: 40 rss: 74Mb L: 13/40 MS: 1 CrossOver- 00:08:26.333 [2024-11-18 14:21:22.250614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffff13 cdw11:131313ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.333 [2024-11-18 14:21:22.250639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.333 [2024-11-18 14:21:22.250699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ff131313 cdw11:13131313 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.333 [2024-11-18 14:21:22.250713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.333 [2024-11-18 14:21:22.250773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:13ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.333 [2024-11-18 14:21:22.250787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.333 [2024-11-18 14:21:22.250846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffff01 cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.333 [2024-11-18 14:21:22.250860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.333 [2024-11-18 14:21:22.250917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:1a0a1a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.333 [2024-11-18 14:21:22.250930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:26.333 #41 NEW cov: 12453 ft: 15549 corp: 32/973b lim: 40 exec/s: 41 rss: 74Mb L: 40/40 MS: 1 CopyPart- 00:08:26.333 [2024-11-18 14:21:22.310718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.333 [2024-11-18 14:21:22.310743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.333 [2024-11-18 14:21:22.310807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.333 [2024-11-18 14:21:22.310821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.333 [2024-11-18 14:21:22.310879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.333 [2024-11-18 14:21:22.310892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.333 [2024-11-18 14:21:22.310953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.333 [2024-11-18 14:21:22.310967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.333 [2024-11-18 14:21:22.311029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:3bffffff cdw11:ffffff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.333 [2024-11-18 14:21:22.311043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:26.333 #42 NEW cov: 12453 ft: 15556 corp: 33/1013b lim: 40 exec/s: 42 rss: 74Mb L: 40/40 MS: 1 CopyPart- 00:08:26.333 [2024-11-18 14:21:22.370608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:fff7ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.333 [2024-11-18 14:21:22.370633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.333 [2024-11-18 14:21:22.370722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff010000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.333 [2024-11-18 14:21:22.370737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.333 [2024-11-18 14:21:22.370799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00ffffff cdw11:ffff1a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.333 [2024-11-18 14:21:22.370812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.333 #43 NEW cov: 12453 ft: 15571 corp: 34/1039b lim: 40 exec/s: 43 rss: 74Mb L: 26/40 MS: 1 ChangeBit- 00:08:26.333 [2024-11-18 14:21:22.410744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.333 [2024-11-18 14:21:22.410769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.333 [2024-11-18 14:21:22.410844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffdf SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.333 [2024-11-18 14:21:22.410859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.333 [2024-11-18 14:21:22.410918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffefffff cdw11:ffff1ae8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.333 [2024-11-18 14:21:22.410932] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.333 #44 NEW cov: 12453 ft: 15590 corp: 35/1068b lim: 40 exec/s: 44 rss: 74Mb L: 29/40 MS: 1 InsertRepeatedBytes- 00:08:26.333 [2024-11-18 14:21:22.450795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.333 [2024-11-18 14:21:22.450820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.333 [2024-11-18 14:21:22.450883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.333 [2024-11-18 14:21:22.450897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.333 [2024-11-18 14:21:22.450975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffefffff cdw11:ffff1ae8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.333 [2024-11-18 14:21:22.450988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.593 #45 NEW cov: 12453 ft: 15614 corp: 36/1097b lim: 40 exec/s: 22 rss: 74Mb L: 29/40 MS: 1 CrossOver- 00:08:26.593 #45 DONE cov: 12453 ft: 15614 corp: 36/1097b lim: 40 exec/s: 22 rss: 74Mb 00:08:26.593 ###### Recommended dictionary. ###### 00:08:26.593 "\205\025\246^\263\213\213\000" # Uses: 1 00:08:26.593 ###### End of recommended dictionary. ###### 00:08:26.593 Done 45 runs in 2 second(s) 00:08:26.593 14:21:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_13.conf /var/tmp/suppress_nvmf_fuzz 00:08:26.593 14:21:22 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:26.593 14:21:22 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:26.593 14:21:22 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:08:26.593 14:21:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:08:26.593 14:21:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:26.593 14:21:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:26.593 14:21:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:26.593 14:21:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:08:26.593 14:21:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:26.593 14:21:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:26.593 14:21:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 14 00:08:26.593 14:21:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4414 00:08:26.593 14:21:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:26.593 14:21:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:08:26.593 14:21:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": 
"4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:26.593 14:21:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:26.593 14:21:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:26.593 14:21:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 00:08:26.593 [2024-11-18 14:21:22.636895] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:08:26.593 [2024-11-18 14:21:22.636963] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid328712 ] 00:08:26.854 [2024-11-18 14:21:22.842360] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:26.854 [2024-11-18 14:21:22.855158] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:26.854 [2024-11-18 14:21:22.907646] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:26.854 [2024-11-18 14:21:22.923987] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:08:26.854 INFO: Running with entropic power schedule (0xFF, 100). 00:08:26.854 INFO: Seed: 4115906965 00:08:26.854 INFO: Loaded 1 modules (387495 inline 8-bit counters): 387495 [0x2acbe4c, 0x2b2a7f3), 00:08:26.854 INFO: Loaded 1 PC tables (387495 PCs): 387495 [0x2b2a7f8,0x3114268), 00:08:26.854 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:26.854 INFO: A corpus is not provided, starting from an empty corpus 00:08:26.854 #2 INITED exec/s: 0 rss: 64Mb 00:08:26.854 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:26.854 This may also happen if the target rejected all inputs we tried so far 00:08:27.114 [2024-11-18 14:21:22.989488] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.114 [2024-11-18 14:21:22.989517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.114 [2024-11-18 14:21:22.989579] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.114 [2024-11-18 14:21:22.989595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.374 NEW_FUNC[1/716]: 0x46d788 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:08:27.374 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:27.374 #3 NEW cov: 12202 ft: 12203 corp: 2/15b lim: 35 exec/s: 0 rss: 72Mb L: 14/14 MS: 1 InsertRepeatedBytes- 00:08:27.374 NEW_FUNC[1/4]: 0x488168 in feat_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:273 00:08:27.374 NEW_FUNC[2/4]: 0x48ecd8 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:27.374 #9 NEW cov: 12422 ft: 13167 corp: 3/44b lim: 35 exec/s: 0 rss: 72Mb L: 29/29 MS: 1 InsertRepeatedBytes- 00:08:27.374 [2024-11-18 14:21:23.370304] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000002c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.374 [2024-11-18 14:21:23.370335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.374 #13 NEW cov: 12435 ft: 14082 corp: 4/53b lim: 35 exec/s: 0 rss: 72Mb L: 9/29 MS: 4 InsertByte-InsertByte-ChangeBit-InsertRepeatedBytes- 00:08:27.374 [2024-11-18 14:21:23.410422] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000002c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.374 [2024-11-18 14:21:23.410447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.374 #14 NEW cov: 12520 ft: 14456 corp: 5/62b lim: 35 exec/s: 0 rss: 72Mb L: 9/29 MS: 1 CopyPart- 00:08:27.374 [2024-11-18 14:21:23.470797] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000007f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.374 [2024-11-18 14:21:23.470823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.374 [2024-11-18 14:21:23.470884] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.374 [2024-11-18 14:21:23.470900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.634 #15 NEW cov: 12520 ft: 14595 corp: 6/76b lim: 35 exec/s: 0 rss: 72Mb L: 14/29 MS: 1 ChangeBit- 00:08:27.634 #16 NEW cov: 12520 ft: 14676 corp: 7/109b lim: 35 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 CMP- DE: 
"\022\000\000\000"- 00:08:27.634 [2024-11-18 14:21:23.590915] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000002c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.634 [2024-11-18 14:21:23.590940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.634 #17 NEW cov: 12520 ft: 14748 corp: 8/118b lim: 35 exec/s: 0 rss: 72Mb L: 9/33 MS: 1 ShuffleBytes- 00:08:27.634 [2024-11-18 14:21:23.651118] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.634 [2024-11-18 14:21:23.651145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.634 #18 NEW cov: 12520 ft: 14800 corp: 9/128b lim: 35 exec/s: 0 rss: 73Mb L: 10/33 MS: 1 EraseBytes- 00:08:27.634 #19 NEW cov: 12520 ft: 14939 corp: 10/161b lim: 35 exec/s: 0 rss: 73Mb L: 33/33 MS: 1 PersAutoDict- DE: "\022\000\000\000"- 00:08:27.634 [2024-11-18 14:21:23.731869] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES ARBITRATION cid:6 cdw10:80000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.634 [2024-11-18 14:21:23.731899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.895 #20 NEW cov: 12520 ft: 15020 corp: 11/194b lim: 35 exec/s: 0 rss: 73Mb L: 33/33 MS: 1 ChangeBinInt- 00:08:27.895 [2024-11-18 14:21:23.791460] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.895 [2024-11-18 14:21:23.791486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.895 #21 NEW cov: 12520 ft: 15060 corp: 12/201b lim: 35 exec/s: 0 rss: 73Mb L: 7/33 MS: 1 EraseBytes- 00:08:27.895 [2024-11-18 14:21:23.852237] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES ARBITRATION cid:6 cdw10:80000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.895 [2024-11-18 14:21:23.852265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.895 [2024-11-18 14:21:23.852324] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.895 [2024-11-18 14:21:23.852337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.895 NEW_FUNC[1/1]: 0x1c52b68 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:27.895 #22 NEW cov: 12543 ft: 15169 corp: 13/234b lim: 35 exec/s: 0 rss: 73Mb L: 33/33 MS: 1 PersAutoDict- DE: "\022\000\000\000"- 00:08:27.895 #23 NEW cov: 12543 ft: 15267 corp: 14/263b lim: 35 exec/s: 0 rss: 73Mb L: 29/33 MS: 1 ChangeBit- 00:08:27.895 [2024-11-18 14:21:23.951921] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.895 [2024-11-18 14:21:23.951947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.895 #26 NEW cov: 12543 ft: 15310 corp: 15/270b lim: 35 exec/s: 26 rss: 73Mb L: 7/33 MS: 3 
EraseBytes-CMP-InsertByte- DE: "\027\000\000\000"- 00:08:27.895 [2024-11-18 14:21:24.012316] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.895 [2024-11-18 14:21:24.012342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.895 [2024-11-18 14:21:24.012416] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:5 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.895 [2024-11-18 14:21:24.012433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.157 #27 NEW cov: 12543 ft: 15360 corp: 16/288b lim: 35 exec/s: 27 rss: 73Mb L: 18/33 MS: 1 CopyPart- 00:08:28.157 [2024-11-18 14:21:24.052781] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES ARBITRATION cid:6 cdw10:80000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.157 [2024-11-18 14:21:24.052807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.157 [2024-11-18 14:21:24.052888] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.157 [2024-11-18 14:21:24.052903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.157 #28 NEW cov: 12543 ft: 15383 corp: 17/321b lim: 35 exec/s: 28 rss: 73Mb L: 33/33 MS: 1 PersAutoDict- DE: "\022\000\000\000"- 00:08:28.157 [2024-11-18 14:21:24.112985] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES ARBITRATION cid:6 cdw10:80000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.157 [2024-11-18 14:21:24.113012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.157 [2024-11-18 14:21:24.113070] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.157 [2024-11-18 14:21:24.113087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.157 #29 NEW cov: 12543 ft: 15439 corp: 18/354b lim: 35 exec/s: 29 rss: 73Mb L: 33/33 MS: 1 CopyPart- 00:08:28.157 [2024-11-18 14:21:24.152836] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.157 [2024-11-18 14:21:24.152863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.157 [2024-11-18 14:21:24.152936] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:5 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.157 [2024-11-18 14:21:24.152953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.157 [2024-11-18 14:21:24.153015] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.157 [2024-11-18 14:21:24.153031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE 
(01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.157 #30 NEW cov: 12543 ft: 15616 corp: 19/377b lim: 35 exec/s: 30 rss: 73Mb L: 23/33 MS: 1 CrossOver- 00:08:28.157 [2024-11-18 14:21:24.213034] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.157 [2024-11-18 14:21:24.213061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.157 [2024-11-18 14:21:24.213119] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:5 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.157 [2024-11-18 14:21:24.213136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.157 [2024-11-18 14:21:24.213196] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:8000005d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.157 [2024-11-18 14:21:24.213212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.157 #31 NEW cov: 12543 ft: 15635 corp: 20/401b lim: 35 exec/s: 31 rss: 73Mb L: 24/33 MS: 1 InsertByte- 00:08:28.157 [2024-11-18 14:21:24.272861] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000002c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.157 [2024-11-18 14:21:24.272886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.418 #32 NEW cov: 12543 ft: 15643 corp: 21/410b lim: 35 exec/s: 32 rss: 73Mb L: 9/33 MS: 1 PersAutoDict- DE: "\022\000\000\000"- 00:08:28.418 [2024-11-18 14:21:24.313555] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000021 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.418 [2024-11-18 14:21:24.313580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.418 #33 NEW cov: 12543 ft: 15658 corp: 22/439b lim: 35 exec/s: 33 rss: 73Mb L: 29/33 MS: 1 ChangeBit- 00:08:28.418 [2024-11-18 14:21:24.353439] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES ARBITRATION cid:6 cdw10:80000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.418 [2024-11-18 14:21:24.353465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.418 #34 NEW cov: 12543 ft: 15669 corp: 23/465b lim: 35 exec/s: 34 rss: 73Mb L: 26/33 MS: 1 EraseBytes- 00:08:28.418 [2024-11-18 14:21:24.413377] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000d1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.418 [2024-11-18 14:21:24.413403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.418 [2024-11-18 14:21:24.413469] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.418 [2024-11-18 14:21:24.413485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.418 #35 NEW cov: 12543 ft: 15684 corp: 24/479b lim: 35 exec/s: 35 rss: 74Mb L: 
14/33 MS: 1 ChangeByte- 00:08:28.418 [2024-11-18 14:21:24.473433] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000002c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.418 [2024-11-18 14:21:24.473458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.418 #36 NEW cov: 12543 ft: 15707 corp: 25/489b lim: 35 exec/s: 36 rss: 74Mb L: 10/33 MS: 1 CopyPart- 00:08:28.418 [2024-11-18 14:21:24.514088] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES ARBITRATION cid:6 cdw10:80000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.418 [2024-11-18 14:21:24.514113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.418 [2024-11-18 14:21:24.514172] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.418 [2024-11-18 14:21:24.514186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.418 #37 NEW cov: 12543 ft: 15713 corp: 26/522b lim: 35 exec/s: 37 rss: 74Mb L: 33/33 MS: 1 ChangeBit- 00:08:28.679 [2024-11-18 14:21:24.554182] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:80000027 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.679 [2024-11-18 14:21:24.554209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.679 [2024-11-18 14:21:24.554281] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.679 [2024-11-18 14:21:24.554296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.679 #38 NEW cov: 12543 ft: 15755 corp: 27/555b lim: 35 exec/s: 38 rss: 74Mb L: 33/33 MS: 1 ChangeByte- 00:08:28.679 [2024-11-18 14:21:24.614181] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.679 [2024-11-18 14:21:24.614207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.679 [2024-11-18 14:21:24.614266] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:5 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.679 [2024-11-18 14:21:24.614281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.679 [2024-11-18 14:21:24.614340] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.679 [2024-11-18 14:21:24.614356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.679 #39 NEW cov: 12543 ft: 15770 corp: 28/580b lim: 35 exec/s: 39 rss: 74Mb L: 25/33 MS: 1 InsertRepeatedBytes- 00:08:28.679 [2024-11-18 14:21:24.654236] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.679 [2024-11-18 14:21:24.654263] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.679 [2024-11-18 14:21:24.654321] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.680 [2024-11-18 14:21:24.654337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.680 [2024-11-18 14:21:24.654400] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.680 [2024-11-18 14:21:24.654417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.680 #40 NEW cov: 12543 ft: 15786 corp: 29/605b lim: 35 exec/s: 40 rss: 74Mb L: 25/33 MS: 1 CopyPart- 00:08:28.680 #41 NEW cov: 12543 ft: 15800 corp: 30/638b lim: 35 exec/s: 41 rss: 74Mb L: 33/33 MS: 1 ChangeByte- 00:08:28.680 [2024-11-18 14:21:24.734139] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000002c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.680 [2024-11-18 14:21:24.734164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.680 #42 NEW cov: 12543 ft: 15801 corp: 31/647b lim: 35 exec/s: 42 rss: 74Mb L: 9/33 MS: 1 CMP- DE: "\001\004\000\000\000\000\000\000"- 00:08:28.680 [2024-11-18 14:21:24.774267] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.680 [2024-11-18 14:21:24.774294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.941 #43 NEW cov: 12543 ft: 15811 corp: 32/654b lim: 35 exec/s: 43 rss: 74Mb L: 7/33 MS: 1 ChangeBinInt- 00:08:28.941 [2024-11-18 14:21:24.835033] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES ARBITRATION cid:6 cdw10:80000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.941 [2024-11-18 14:21:24.835059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.941 [2024-11-18 14:21:24.835140] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.941 [2024-11-18 14:21:24.835155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.941 #44 NEW cov: 12543 ft: 15823 corp: 33/687b lim: 35 exec/s: 44 rss: 74Mb L: 33/33 MS: 1 ChangeBit- 00:08:28.941 #45 NEW cov: 12543 ft: 15861 corp: 34/698b lim: 35 exec/s: 45 rss: 74Mb L: 11/33 MS: 1 CrossOver- 00:08:28.941 [2024-11-18 14:21:24.934720] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000002c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.941 [2024-11-18 14:21:24.934744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.941 #46 NEW cov: 12543 ft: 15868 corp: 35/706b lim: 35 exec/s: 46 rss: 74Mb L: 8/33 MS: 1 EraseBytes- 00:08:28.941 #47 NEW cov: 12543 ft: 15899 corp: 36/735b lim: 35 exec/s: 23 rss: 74Mb L: 29/33 MS: 1 ShuffleBytes- 00:08:28.941 #47 DONE cov: 12543 ft: 15899 corp: 
36/735b lim: 35 exec/s: 23 rss: 74Mb 00:08:28.941 ###### Recommended dictionary. ###### 00:08:28.941 "\022\000\000\000" # Uses: 4 00:08:28.941 "\027\000\000\000" # Uses: 0 00:08:28.941 "\001\004\000\000\000\000\000\000" # Uses: 0 00:08:28.941 ###### End of recommended dictionary. ###### 00:08:28.941 Done 47 runs in 2 second(s) 00:08:29.201 14:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_14.conf /var/tmp/suppress_nvmf_fuzz 00:08:29.201 14:21:25 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:29.201 14:21:25 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:29.201 14:21:25 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:08:29.201 14:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:08:29.202 14:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:29.202 14:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:29.202 14:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:29.202 14:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:08:29.202 14:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:29.202 14:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:29.202 14:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 15 00:08:29.202 14:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4415 00:08:29.202 14:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:29.202 14:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:08:29.202 14:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:29.202 14:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:29.202 14:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:29.202 14:21:25 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 00:08:29.202 [2024-11-18 14:21:25.148221] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 
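Between runs, the shell trace above shows how nvmf/run.sh stamps out each fuzzer instance: it derives the listening port from the fuzzer number (printf %02d 14 becomes port 4414; 15 becomes 4415), rewrites trsvcid into a per-instance config with sed (/tmp/fuzz_json_14.conf, then /tmp/fuzz_json_15.conf), registers the two LSAN leak suppressions, and relaunches llvm_nvme_fuzz against the new trid. Each run then closes with a "Recommended dictionary" summary: byte strings libFuzzer's comparison instrumentation found useful, printed as octal escapes with a use count (run 13 promoted "\205\025\246^\263\213\213\000" with Uses: 1; run 14 promoted "\022\000\000\000" with Uses: 4, matching the PersAutoDict- tags in its MS: mutation-sequence fields). A 4-byte entry like the latter plausibly feeds one 32-bit command dword, though libFuzzer records it only as a raw byte string; a minimal decoding sketch, illustrative rather than SPDK source:

    /* Hypothetical decoder for the run-14 dictionary entry; that the bytes
     * land in a command dword such as cdw10 is an assumption. */
    #include <stdio.h>

    int main(void)
    {
        const unsigned char entry[] = "\022\000\000\000"; /* 0x12 0x00 0x00 0x00 */
        unsigned int value = 0;

        for (int i = 3; i >= 0; i--)
            value = (value << 8) | entry[i];              /* little-endian read */

        printf("entry decodes to 0x%08x\n", value);       /* prints 0x00000012 */
        return 0;
    }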
00:08:29.202 [2024-11-18 14:21:25.148285] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid329126 ] 00:08:29.462 [2024-11-18 14:21:25.340340] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:29.462 [2024-11-18 14:21:25.352155] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:29.463 [2024-11-18 14:21:25.404456] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:29.463 [2024-11-18 14:21:25.420793] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:08:29.463 INFO: Running with entropic power schedule (0xFF, 100). 00:08:29.463 INFO: Seed: 2316946265 00:08:29.463 INFO: Loaded 1 modules (387495 inline 8-bit counters): 387495 [0x2acbe4c, 0x2b2a7f3), 00:08:29.463 INFO: Loaded 1 PC tables (387495 PCs): 387495 [0x2b2a7f8,0x3114268), 00:08:29.463 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:29.463 INFO: A corpus is not provided, starting from an empty corpus 00:08:29.463 #2 INITED exec/s: 0 rss: 66Mb 00:08:29.463 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:29.463 This may also happen if the target rejected all inputs we tried so far 00:08:29.463 [2024-11-18 14:21:25.497484] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.463 [2024-11-18 14:21:25.497522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.463 [2024-11-18 14:21:25.497659] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.463 [2024-11-18 14:21:25.497679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.722 NEW_FUNC[1/716]: 0x46ecc8 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:08:29.723 NEW_FUNC[2/716]: 0x48ecd8 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:29.723 #14 NEW cov: 12204 ft: 12205 corp: 2/27b lim: 35 exec/s: 0 rss: 72Mb L: 26/26 MS: 2 CrossOver-InsertRepeatedBytes- 00:08:29.723 [2024-11-18 14:21:25.837567] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.723 [2024-11-18 14:21:25.837624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.723 [2024-11-18 14:21:25.837719] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.723 [2024-11-18 14:21:25.837746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.982 #15 NEW cov: 12334 ft: 12885 corp: 3/53b lim: 35 exec/s: 0 rss: 72Mb L: 26/26 MS: 1 ChangeBinInt- 00:08:29.982 [2024-11-18 14:21:25.897386] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED 
cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.982 [2024-11-18 14:21:25.897411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.982 [2024-11-18 14:21:25.897485] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.982 [2024-11-18 14:21:25.897499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.982 [2024-11-18 14:21:25.897558] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.982 [2024-11-18 14:21:25.897572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.982 #16 NEW cov: 12340 ft: 13390 corp: 4/80b lim: 35 exec/s: 0 rss: 72Mb L: 27/27 MS: 1 InsertRepeatedBytes- 00:08:29.982 [2024-11-18 14:21:25.937377] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.982 [2024-11-18 14:21:25.937402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.982 #22 NEW cov: 12425 ft: 13893 corp: 5/99b lim: 35 exec/s: 0 rss: 72Mb L: 19/27 MS: 1 EraseBytes- 00:08:29.982 [2024-11-18 14:21:25.977798] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.982 [2024-11-18 14:21:25.977823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.982 [2024-11-18 14:21:25.977883] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.982 [2024-11-18 14:21:25.977896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.982 [2024-11-18 14:21:25.977952] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.982 [2024-11-18 14:21:25.977966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.982 #24 NEW cov: 12425 ft: 14341 corp: 6/128b lim: 35 exec/s: 0 rss: 72Mb L: 29/29 MS: 2 CrossOver-InsertRepeatedBytes- 00:08:29.982 [2024-11-18 14:21:26.017884] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.982 [2024-11-18 14:21:26.017909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.982 [2024-11-18 14:21:26.017969] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.982 [2024-11-18 14:21:26.017983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.982 [2024-11-18 14:21:26.018042] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.982 
[2024-11-18 14:21:26.018055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:29.982 #25 NEW cov: 12425 ft: 14407 corp: 7/161b lim: 35 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 CopyPart- 00:08:29.982 [2024-11-18 14:21:26.057882] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.982 [2024-11-18 14:21:26.057907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.982 [2024-11-18 14:21:26.057966] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.982 [2024-11-18 14:21:26.057980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.982 #26 NEW cov: 12425 ft: 14500 corp: 8/186b lim: 35 exec/s: 0 rss: 72Mb L: 25/33 MS: 1 EraseBytes- 00:08:30.242 [2024-11-18 14:21:26.117758] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.242 [2024-11-18 14:21:26.117783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.242 #27 NEW cov: 12425 ft: 14694 corp: 9/199b lim: 35 exec/s: 0 rss: 72Mb L: 13/33 MS: 1 EraseBytes- 00:08:30.242 [2024-11-18 14:21:26.178156] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.242 [2024-11-18 14:21:26.178181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.242 [2024-11-18 14:21:26.178240] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.242 [2024-11-18 14:21:26.178254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.242 [2024-11-18 14:21:26.178311] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.242 [2024-11-18 14:21:26.178324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.242 #28 NEW cov: 12425 ft: 14780 corp: 10/221b lim: 35 exec/s: 0 rss: 72Mb L: 22/33 MS: 1 InsertRepeatedBytes- 00:08:30.242 [2024-11-18 14:21:26.238509] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.242 [2024-11-18 14:21:26.238533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.242 [2024-11-18 14:21:26.238613] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.242 [2024-11-18 14:21:26.238628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.242 [2024-11-18 14:21:26.238688] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 
0x0 00:08:30.242 [2024-11-18 14:21:26.238701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.242 #29 NEW cov: 12425 ft: 14813 corp: 11/251b lim: 35 exec/s: 0 rss: 72Mb L: 30/33 MS: 1 InsertByte- 00:08:30.242 [2024-11-18 14:21:26.298347] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000012f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.242 [2024-11-18 14:21:26.298371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.242 [2024-11-18 14:21:26.298428] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000133 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.242 [2024-11-18 14:21:26.298441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.242 #33 NEW cov: 12425 ft: 14862 corp: 12/266b lim: 35 exec/s: 0 rss: 72Mb L: 15/33 MS: 4 InsertByte-ChangeASCIIInt-ChangeByte-InsertRepeatedBytes- 00:08:30.242 [2024-11-18 14:21:26.338358] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.242 [2024-11-18 14:21:26.338383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.242 NEW_FUNC[1/1]: 0x1c52b68 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:30.242 #34 NEW cov: 12448 ft: 14898 corp: 13/279b lim: 35 exec/s: 0 rss: 73Mb L: 13/33 MS: 1 ShuffleBytes- 00:08:30.502 [2024-11-18 14:21:26.378715] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000041 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.502 [2024-11-18 14:21:26.378741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.502 [2024-11-18 14:21:26.378800] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.502 [2024-11-18 14:21:26.378814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.502 [2024-11-18 14:21:26.378886] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.502 [2024-11-18 14:21:26.378900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.502 #35 NEW cov: 12448 ft: 14906 corp: 14/306b lim: 35 exec/s: 0 rss: 73Mb L: 27/33 MS: 1 InsertByte- 00:08:30.502 [2024-11-18 14:21:26.418875] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.502 [2024-11-18 14:21:26.418900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.502 [2024-11-18 14:21:26.418959] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.502 [2024-11-18 14:21:26.418973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 
m:0 dnr:0 00:08:30.502 #36 NEW cov: 12448 ft: 14947 corp: 15/332b lim: 35 exec/s: 0 rss: 73Mb L: 26/33 MS: 1 ShuffleBytes- 00:08:30.502 [2024-11-18 14:21:26.458719] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.502 [2024-11-18 14:21:26.458743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.502 #37 NEW cov: 12448 ft: 14996 corp: 16/345b lim: 35 exec/s: 37 rss: 73Mb L: 13/33 MS: 1 ChangeBinInt- 00:08:30.502 [2024-11-18 14:21:26.519174] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.502 [2024-11-18 14:21:26.519200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.502 [2024-11-18 14:21:26.519271] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.502 [2024-11-18 14:21:26.519286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.502 #38 NEW cov: 12448 ft: 15007 corp: 17/371b lim: 35 exec/s: 38 rss: 73Mb L: 26/33 MS: 1 ChangeByte- 00:08:30.502 [2024-11-18 14:21:26.559249] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.502 [2024-11-18 14:21:26.559273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.502 [2024-11-18 14:21:26.559331] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.502 [2024-11-18 14:21:26.559345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.502 [2024-11-18 14:21:26.559406] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.502 [2024-11-18 14:21:26.559420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.502 #39 NEW cov: 12448 ft: 15048 corp: 18/393b lim: 35 exec/s: 39 rss: 73Mb L: 22/33 MS: 1 ChangeBit- 00:08:30.502 [2024-11-18 14:21:26.619410] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.502 [2024-11-18 14:21:26.619435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.502 [2024-11-18 14:21:26.619507] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.502 [2024-11-18 14:21:26.619521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.763 #40 NEW cov: 12448 ft: 15075 corp: 19/419b lim: 35 exec/s: 40 rss: 73Mb L: 26/33 MS: 1 ChangeBinInt- 00:08:30.763 [2024-11-18 14:21:26.679595] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.763 
[2024-11-18 14:21:26.679620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.763 [2024-11-18 14:21:26.679706] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.763 [2024-11-18 14:21:26.679720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.763 #41 NEW cov: 12448 ft: 15082 corp: 20/444b lim: 35 exec/s: 41 rss: 73Mb L: 25/33 MS: 1 ChangeByte- 00:08:30.763 [2024-11-18 14:21:26.739888] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.763 [2024-11-18 14:21:26.739913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.763 [2024-11-18 14:21:26.739968] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.763 [2024-11-18 14:21:26.739982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.763 [2024-11-18 14:21:26.740039] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.763 [2024-11-18 14:21:26.740052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.763 #42 NEW cov: 12448 ft: 15094 corp: 21/477b lim: 35 exec/s: 42 rss: 73Mb L: 33/33 MS: 1 CopyPart- 00:08:30.763 [2024-11-18 14:21:26.800086] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.763 [2024-11-18 14:21:26.800110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.763 [2024-11-18 14:21:26.800186] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.763 [2024-11-18 14:21:26.800201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.763 [2024-11-18 14:21:26.800259] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.763 [2024-11-18 14:21:26.800273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.763 #43 NEW cov: 12448 ft: 15105 corp: 22/507b lim: 35 exec/s: 43 rss: 73Mb L: 30/33 MS: 1 ChangeBit- 00:08:30.763 [2024-11-18 14:21:26.860075] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.763 [2024-11-18 14:21:26.860100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.763 [2024-11-18 14:21:26.860175] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.763 [2024-11-18 14:21:26.860191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD 
(00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.763 [2024-11-18 14:21:26.860250] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000000ba SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.763 [2024-11-18 14:21:26.860263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.024 #44 NEW cov: 12448 ft: 15110 corp: 23/534b lim: 35 exec/s: 44 rss: 73Mb L: 27/33 MS: 1 ChangeByte- 00:08:31.024 [2024-11-18 14:21:26.920241] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.024 [2024-11-18 14:21:26.920265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.024 [2024-11-18 14:21:26.920339] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.024 [2024-11-18 14:21:26.920354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.024 #45 NEW cov: 12448 ft: 15120 corp: 24/560b lim: 35 exec/s: 45 rss: 73Mb L: 26/33 MS: 1 ChangeBinInt- 00:08:31.024 [2024-11-18 14:21:26.980262] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.024 [2024-11-18 14:21:26.980288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.024 #46 NEW cov: 12448 ft: 15144 corp: 25/578b lim: 35 exec/s: 46 rss: 73Mb L: 18/33 MS: 1 CrossOver- 00:08:31.024 [2024-11-18 14:21:27.020689] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.024 [2024-11-18 14:21:27.020715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.024 [2024-11-18 14:21:27.020773] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.024 [2024-11-18 14:21:27.020788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.024 [2024-11-18 14:21:27.020849] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.024 [2024-11-18 14:21:27.020863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.024 #47 NEW cov: 12448 ft: 15201 corp: 26/611b lim: 35 exec/s: 47 rss: 73Mb L: 33/33 MS: 1 ShuffleBytes- 00:08:31.024 [2024-11-18 14:21:27.060343] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.024 [2024-11-18 14:21:27.060369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.024 #48 NEW cov: 12448 ft: 15212 corp: 27/624b lim: 35 exec/s: 48 rss: 73Mb L: 13/33 MS: 1 ShuffleBytes- 00:08:31.024 [2024-11-18 14:21:27.100805] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005be SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:08:31.024 [2024-11-18 14:21:27.100831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.024 [2024-11-18 14:21:27.100893] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.025 [2024-11-18 14:21:27.100907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.025 #49 NEW cov: 12448 ft: 15218 corp: 28/650b lim: 35 exec/s: 49 rss: 73Mb L: 26/33 MS: 1 CrossOver- 00:08:31.025 [2024-11-18 14:21:27.140875] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.025 [2024-11-18 14:21:27.140901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.025 [2024-11-18 14:21:27.140960] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005ae SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.025 [2024-11-18 14:21:27.140974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.285 #50 NEW cov: 12448 ft: 15244 corp: 29/676b lim: 35 exec/s: 50 rss: 74Mb L: 26/33 MS: 1 ChangeByte- 00:08:31.285 [2024-11-18 14:21:27.200853] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000012f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.285 [2024-11-18 14:21:27.200878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.285 [2024-11-18 14:21:27.200953] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000133 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.285 [2024-11-18 14:21:27.200968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.285 #51 NEW cov: 12448 ft: 15264 corp: 30/691b lim: 35 exec/s: 51 rss: 74Mb L: 15/33 MS: 1 ChangeBit- 00:08:31.285 [2024-11-18 14:21:27.261145] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.285 [2024-11-18 14:21:27.261170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.285 [2024-11-18 14:21:27.261229] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.285 [2024-11-18 14:21:27.261243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.285 [2024-11-18 14:21:27.261316] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.285 [2024-11-18 14:21:27.261331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.285 #52 NEW cov: 12448 ft: 15342 corp: 31/713b lim: 35 exec/s: 52 rss: 74Mb L: 22/33 MS: 1 ShuffleBytes- 00:08:31.285 [2024-11-18 14:21:27.301230] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 
cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.285 [2024-11-18 14:21:27.301255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.285 #53 NEW cov: 12448 ft: 15353 corp: 32/731b lim: 35 exec/s: 53 rss: 74Mb L: 18/33 MS: 1 ChangeByte- 00:08:31.285 [2024-11-18 14:21:27.361538] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.285 [2024-11-18 14:21:27.361568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.285 [2024-11-18 14:21:27.361642] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.285 [2024-11-18 14:21:27.361657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.285 #54 NEW cov: 12448 ft: 15359 corp: 33/757b lim: 35 exec/s: 54 rss: 74Mb L: 26/33 MS: 1 CrossOver- 00:08:31.285 [2024-11-18 14:21:27.401733] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.285 [2024-11-18 14:21:27.401758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.285 [2024-11-18 14:21:27.401832] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.285 [2024-11-18 14:21:27.401845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.285 [2024-11-18 14:21:27.401904] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.285 [2024-11-18 14:21:27.401917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.546 #55 NEW cov: 12448 ft: 15387 corp: 34/787b lim: 35 exec/s: 55 rss: 74Mb L: 30/33 MS: 1 ChangeBinInt- 00:08:31.546 [2024-11-18 14:21:27.461479] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005be SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.546 [2024-11-18 14:21:27.461504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.546 #56 NEW cov: 12448 ft: 15425 corp: 35/798b lim: 35 exec/s: 28 rss: 74Mb L: 11/33 MS: 1 EraseBytes- 00:08:31.546 #56 DONE cov: 12448 ft: 15425 corp: 35/798b lim: 35 exec/s: 28 rss: 74Mb 00:08:31.546 Done 56 runs in 2 second(s) 00:08:31.546 14:21:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_15.conf /var/tmp/suppress_nvmf_fuzz 00:08:31.546 14:21:27 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:31.546 14:21:27 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:31.546 14:21:27 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:08:31.546 14:21:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:08:31.546 14:21:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:31.546 14:21:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:31.546 14:21:27 
llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:31.546 14:21:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:08:31.546 14:21:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:31.546 14:21:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:31.546 14:21:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 16 00:08:31.546 14:21:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4416 00:08:31.546 14:21:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:31.546 14:21:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:08:31.546 14:21:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:31.546 14:21:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:31.546 14:21:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:31.546 14:21:27 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 00:08:31.546 [2024-11-18 14:21:27.626806] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:08:31.546 [2024-11-18 14:21:27.626894] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid329659 ] 00:08:31.807 [2024-11-18 14:21:27.827954] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:31.807 [2024-11-18 14:21:27.840246] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:31.807 [2024-11-18 14:21:27.892701] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:31.807 [2024-11-18 14:21:27.909035] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:08:31.807 INFO: Running with entropic power schedule (0xFF, 100). 00:08:31.807 INFO: Seed: 509970506 00:08:32.068 INFO: Loaded 1 modules (387495 inline 8-bit counters): 387495 [0x2acbe4c, 0x2b2a7f3), 00:08:32.068 INFO: Loaded 1 PC tables (387495 PCs): 387495 [0x2b2a7f8,0x3114268), 00:08:32.068 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:32.068 INFO: A corpus is not provided, starting from an empty corpus 00:08:32.068 #2 INITED exec/s: 0 rss: 65Mb 00:08:32.068 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:32.068 This may also happen if the target rejected all inputs we tried so far 00:08:32.068 [2024-11-18 14:21:27.974897] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.068 [2024-11-18 14:21:27.974938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.328 NEW_FUNC[1/715]: 0x470188 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:08:32.328 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:32.328 #4 NEW cov: 12282 ft: 12296 corp: 2/23b lim: 105 exec/s: 0 rss: 72Mb L: 22/22 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:32.328 [2024-11-18 14:21:28.305858] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.328 [2024-11-18 14:21:28.305892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.328 NEW_FUNC[1/1]: 0x19689c8 in _nvme_qpair_complete_abort_queued_reqs /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:593 00:08:32.328 #5 NEW cov: 12425 ft: 13002 corp: 3/51b lim: 105 exec/s: 0 rss: 72Mb L: 28/28 MS: 1 InsertRepeatedBytes- 00:08:32.328 [2024-11-18 14:21:28.376012] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.328 [2024-11-18 14:21:28.376039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.328 #6 NEW cov: 12431 ft: 13205 corp: 4/74b lim: 105 exec/s: 0 rss: 72Mb L: 23/28 MS: 1 InsertByte- 00:08:32.328 [2024-11-18 14:21:28.416171] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.328 [2024-11-18 14:21:28.416201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.328 [2024-11-18 14:21:28.416321] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744069415305215 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.328 [2024-11-18 14:21:28.416345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.589 #7 NEW cov: 12516 ft: 13921 corp: 5/119b lim: 105 exec/s: 0 rss: 72Mb L: 45/45 MS: 1 CopyPart- 00:08:32.589 [2024-11-18 14:21:28.486236] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.589 [2024-11-18 14:21:28.486265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.589 #13 NEW cov: 12516 ft: 13987 corp: 6/142b lim: 105 exec/s: 0 rss: 72Mb L: 23/45 MS: 1 CopyPart- 00:08:32.589 [2024-11-18 14:21:28.536420] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.589 [2024-11-18 14:21:28.536455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.589 #14 NEW cov: 12516 ft: 14063 corp: 7/165b lim: 105 exec/s: 0 rss: 73Mb L: 23/45 MS: 1 CopyPart- 00:08:32.589 [2024-11-18 14:21:28.596749] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.589 [2024-11-18 14:21:28.596785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.589 [2024-11-18 14:21:28.596911] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744069415305215 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.589 [2024-11-18 14:21:28.596936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.589 #15 NEW cov: 12516 ft: 14116 corp: 8/210b lim: 105 exec/s: 0 rss: 73Mb L: 45/45 MS: 1 ChangeBit- 00:08:32.589 [2024-11-18 14:21:28.656694] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.589 [2024-11-18 14:21:28.656729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.589 #16 NEW cov: 12516 ft: 14200 corp: 9/233b lim: 105 exec/s: 0 rss: 73Mb L: 23/45 MS: 1 ChangeBit- 00:08:32.589 [2024-11-18 14:21:28.707131] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.589 [2024-11-18 14:21:28.707165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.589 [2024-11-18 14:21:28.707299] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744069415305215 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.589 [2024-11-18 14:21:28.707324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.849 #17 NEW cov: 12516 ft: 14234 corp: 10/278b lim: 105 exec/s: 0 rss: 73Mb L: 45/45 MS: 1 ShuffleBytes- 00:08:32.849 [2024-11-18 14:21:28.757027] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743137591230463 len:9767 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.849 [2024-11-18 14:21:28.757061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.849 #18 NEW cov: 12516 ft: 14408 corp: 11/300b lim: 105 exec/s: 0 rss: 73Mb L: 22/45 MS: 1 EraseBytes- 00:08:32.849 [2024-11-18 14:21:28.827250] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599132415 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.849 [2024-11-18 14:21:28.827283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.849 NEW_FUNC[1/1]: 0x1c52b68 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:32.849 #19 NEW cov: 12539 ft: 
14485 corp: 12/328b lim: 105 exec/s: 0 rss: 73Mb L: 28/45 MS: 1 ChangeBinInt- 00:08:32.849 [2024-11-18 14:21:28.877447] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.849 [2024-11-18 14:21:28.877478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.849 #20 NEW cov: 12539 ft: 14507 corp: 13/366b lim: 105 exec/s: 0 rss: 73Mb L: 38/45 MS: 1 EraseBytes- 00:08:32.849 [2024-11-18 14:21:28.947571] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:17408 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.849 [2024-11-18 14:21:28.947600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.109 #21 NEW cov: 12539 ft: 14537 corp: 14/390b lim: 105 exec/s: 21 rss: 73Mb L: 24/45 MS: 1 InsertByte- 00:08:33.109 [2024-11-18 14:21:29.017796] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446743962224951295 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.109 [2024-11-18 14:21:29.017827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.109 #23 NEW cov: 12539 ft: 14589 corp: 15/418b lim: 105 exec/s: 23 rss: 73Mb L: 28/45 MS: 2 EraseBytes-CMP- DE: "\346\003\000\000\000\000\000\000"- 00:08:33.109 [2024-11-18 14:21:29.088193] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.109 [2024-11-18 14:21:29.088220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.109 [2024-11-18 14:21:29.088351] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744069415305215 len:54272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.109 [2024-11-18 14:21:29.088374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.110 #24 NEW cov: 12539 ft: 14601 corp: 16/463b lim: 105 exec/s: 24 rss: 73Mb L: 45/45 MS: 1 ChangeByte- 00:08:33.110 [2024-11-18 14:21:29.158179] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:9767 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.110 [2024-11-18 14:21:29.158204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.110 #25 NEW cov: 12539 ft: 14619 corp: 17/499b lim: 105 exec/s: 25 rss: 73Mb L: 36/45 MS: 1 CopyPart- 00:08:33.110 [2024-11-18 14:21:29.208392] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599132415 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.110 [2024-11-18 14:21:29.208417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.369 #26 NEW cov: 12539 ft: 14638 corp: 18/527b lim: 105 exec/s: 26 rss: 73Mb L: 28/45 MS: 1 ShuffleBytes- 00:08:33.369 [2024-11-18 14:21:29.278682] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 
lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.369 [2024-11-18 14:21:29.278715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.369 [2024-11-18 14:21:29.278828] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446743137407401983 len:9767 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.369 [2024-11-18 14:21:29.278844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.369 #27 NEW cov: 12539 ft: 14709 corp: 19/582b lim: 105 exec/s: 27 rss: 73Mb L: 55/55 MS: 1 CrossOver- 00:08:33.369 [2024-11-18 14:21:29.328983] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:144680345827016703 len:515 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.369 [2024-11-18 14:21:29.329013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.369 [2024-11-18 14:21:29.329087] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:144680345676153346 len:515 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.369 [2024-11-18 14:21:29.329117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.369 [2024-11-18 14:21:29.329235] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:144680345676153346 len:515 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.369 [2024-11-18 14:21:29.329251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.369 #28 NEW cov: 12539 ft: 15032 corp: 20/657b lim: 105 exec/s: 28 rss: 73Mb L: 75/75 MS: 1 InsertRepeatedBytes- 00:08:33.369 [2024-11-18 14:21:29.379097] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.369 [2024-11-18 14:21:29.379128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.369 [2024-11-18 14:21:29.379223] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744069415305215 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.369 [2024-11-18 14:21:29.379244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.369 #29 NEW cov: 12539 ft: 15038 corp: 21/702b lim: 105 exec/s: 29 rss: 73Mb L: 45/75 MS: 1 ChangeBit- 00:08:33.369 [2024-11-18 14:21:29.429613] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.369 [2024-11-18 14:21:29.429644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.369 [2024-11-18 14:21:29.429728] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744069415305215 len:57569 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.369 [2024-11-18 14:21:29.429748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 
cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.369 [2024-11-18 14:21:29.429867] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:16204198715729174752 len:57569 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.369 [2024-11-18 14:21:29.429891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.369 [2024-11-18 14:21:29.430005] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:16204198715729174752 len:57600 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.369 [2024-11-18 14:21:29.430024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.369 #30 NEW cov: 12539 ft: 15537 corp: 22/792b lim: 105 exec/s: 30 rss: 73Mb L: 90/90 MS: 1 InsertRepeatedBytes- 00:08:33.369 [2024-11-18 14:21:29.469389] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.369 [2024-11-18 14:21:29.469424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.369 [2024-11-18 14:21:29.469530] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744069415305215 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.369 [2024-11-18 14:21:29.469554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.630 #31 NEW cov: 12539 ft: 15538 corp: 23/845b lim: 105 exec/s: 31 rss: 74Mb L: 53/90 MS: 1 CopyPart- 00:08:33.630 [2024-11-18 14:21:29.539949] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446742974382473215 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.630 [2024-11-18 14:21:29.539978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.630 [2024-11-18 14:21:29.540085] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.630 [2024-11-18 14:21:29.540110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.630 [2024-11-18 14:21:29.540229] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.630 [2024-11-18 14:21:29.540249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.630 [2024-11-18 14:21:29.540366] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:3096224743817215 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.630 [2024-11-18 14:21:29.540386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.630 #32 NEW cov: 12539 ft: 15574 corp: 24/936b lim: 105 exec/s: 32 rss: 74Mb L: 91/91 MS: 1 InsertRepeatedBytes- 00:08:33.630 [2024-11-18 14:21:29.579588] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.630 
[2024-11-18 14:21:29.579616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.630 [2024-11-18 14:21:29.579735] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:16204198711957061631 len:57569 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.630 [2024-11-18 14:21:29.579754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.630 #33 NEW cov: 12539 ft: 15596 corp: 25/996b lim: 105 exec/s: 33 rss: 74Mb L: 60/91 MS: 1 EraseBytes- 00:08:33.630 [2024-11-18 14:21:29.650392] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:16348879057783349986 len:58083 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.630 [2024-11-18 14:21:29.650421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.630 [2024-11-18 14:21:29.650503] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:16348879061405328098 len:58083 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.630 [2024-11-18 14:21:29.650523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.630 [2024-11-18 14:21:29.650638] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:16348879061405328098 len:58083 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.630 [2024-11-18 14:21:29.650662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.630 [2024-11-18 14:21:29.650777] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:16348879061405328098 len:58083 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.630 [2024-11-18 14:21:29.650800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.630 [2024-11-18 14:21:29.650920] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:16357073846121194210 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.630 [2024-11-18 14:21:29.650944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:33.630 #36 NEW cov: 12539 ft: 15629 corp: 26/1101b lim: 105 exec/s: 36 rss: 74Mb L: 105/105 MS: 3 EraseBytes-CopyPart-InsertRepeatedBytes- 00:08:33.631 [2024-11-18 14:21:29.700076] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.631 [2024-11-18 14:21:29.700107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.631 [2024-11-18 14:21:29.700234] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65291 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.631 [2024-11-18 14:21:29.700258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.631 #37 NEW cov: 12539 ft: 15640 corp: 27/1145b lim: 105 exec/s: 37 rss: 74Mb L: 44/105 MS: 1 EraseBytes- 00:08:33.891 
[2024-11-18 14:21:29.770097] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.891 [2024-11-18 14:21:29.770129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.891 #38 NEW cov: 12539 ft: 15671 corp: 28/1168b lim: 105 exec/s: 38 rss: 74Mb L: 23/105 MS: 1 ShuffleBytes- 00:08:33.891 [2024-11-18 14:21:29.820186] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.891 [2024-11-18 14:21:29.820218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.891 #39 NEW cov: 12539 ft: 15718 corp: 29/1192b lim: 105 exec/s: 39 rss: 74Mb L: 24/105 MS: 1 InsertByte- 00:08:33.891 [2024-11-18 14:21:29.890371] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:3096220633399295 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.891 [2024-11-18 14:21:29.890407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.891 #40 NEW cov: 12539 ft: 15736 corp: 30/1220b lim: 105 exec/s: 40 rss: 74Mb L: 28/105 MS: 1 EraseBytes- 00:08:33.891 [2024-11-18 14:21:29.941313] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:16348879057783349986 len:58083 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.891 [2024-11-18 14:21:29.941346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.891 [2024-11-18 14:21:29.941429] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:16348879061405328098 len:58083 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.891 [2024-11-18 14:21:29.941450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.891 [2024-11-18 14:21:29.941572] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:16348879061405328098 len:58083 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.892 [2024-11-18 14:21:29.941606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.892 [2024-11-18 14:21:29.941723] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:16348879061405328098 len:58083 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.892 [2024-11-18 14:21:29.941748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.892 [2024-11-18 14:21:29.941870] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:16357073846121194210 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.892 [2024-11-18 14:21:29.941892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:33.892 #41 NEW cov: 12539 ft: 15745 corp: 31/1325b lim: 105 exec/s: 20 rss: 74Mb L: 105/105 MS: 1 ChangeByte- 00:08:33.892 #41 DONE cov: 12539 ft: 15745 corp: 31/1325b lim: 105 exec/s: 20 rss: 74Mb 00:08:33.892 ###### 
Recommended dictionary. ###### 00:08:33.892 "\346\003\000\000\000\000\000\000" # Uses: 0 00:08:33.892 ###### End of recommended dictionary. ###### 00:08:33.892 Done 41 runs in 2 second(s) 00:08:34.152 14:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_16.conf /var/tmp/suppress_nvmf_fuzz 00:08:34.152 14:21:30 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:34.152 14:21:30 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:34.152 14:21:30 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:08:34.152 14:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:08:34.152 14:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:34.152 14:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:34.152 14:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:34.152 14:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:08:34.152 14:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:34.152 14:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:34.152 14:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 17 00:08:34.152 14:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4417 00:08:34.152 14:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:34.152 14:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:08:34.152 14:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:34.152 14:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:34.152 14:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:34.152 14:21:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 00:08:34.152 [2024-11-18 14:21:30.129089] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 
00:08:34.152 [2024-11-18 14:21:30.129157] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid329956 ] 00:08:34.413 [2024-11-18 14:21:30.415651] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:34.413 [2024-11-18 14:21:30.436289] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:34.413 [2024-11-18 14:21:30.489173] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:34.413 [2024-11-18 14:21:30.505502] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:08:34.413 INFO: Running with entropic power schedule (0xFF, 100). 00:08:34.413 INFO: Seed: 3104970548 00:08:34.673 INFO: Loaded 1 modules (387495 inline 8-bit counters): 387495 [0x2acbe4c, 0x2b2a7f3), 00:08:34.673 INFO: Loaded 1 PC tables (387495 PCs): 387495 [0x2b2a7f8,0x3114268), 00:08:34.673 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:34.673 INFO: A corpus is not provided, starting from an empty corpus 00:08:34.673 #2 INITED exec/s: 0 rss: 65Mb 00:08:34.673 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:34.673 This may also happen if the target rejected all inputs we tried so far 00:08:34.673 [2024-11-18 14:21:30.564327] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.673 [2024-11-18 14:21:30.564357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.673 [2024-11-18 14:21:30.564405] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.673 [2024-11-18 14:21:30.564424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.673 [2024-11-18 14:21:30.564478] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.673 [2024-11-18 14:21:30.564493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.934 NEW_FUNC[1/717]: 0x473508 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:08:34.934 NEW_FUNC[2/717]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:34.934 #9 NEW cov: 12329 ft: 12330 corp: 2/84b lim: 120 exec/s: 0 rss: 72Mb L: 83/83 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:34.934 [2024-11-18 14:21:30.895403] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7740398492046814059 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.934 [2024-11-18 14:21:30.895468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.934 [2024-11-18 14:21:30.895581] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 
lba:7740398493674204011 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.934 [2024-11-18 14:21:30.895616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.934 [2024-11-18 14:21:30.895703] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:7740398493674204011 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.934 [2024-11-18 14:21:30.895736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.934 #15 NEW cov: 12446 ft: 13127 corp: 3/156b lim: 120 exec/s: 0 rss: 72Mb L: 72/83 MS: 1 InsertRepeatedBytes- 00:08:34.934 [2024-11-18 14:21:30.945170] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709497087 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.934 [2024-11-18 14:21:30.945197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.934 [2024-11-18 14:21:30.945236] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.934 [2024-11-18 14:21:30.945251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.934 [2024-11-18 14:21:30.945305] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.934 [2024-11-18 14:21:30.945320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.934 #16 NEW cov: 12452 ft: 13318 corp: 4/239b lim: 120 exec/s: 0 rss: 72Mb L: 83/83 MS: 1 ChangeByte- 00:08:34.934 [2024-11-18 14:21:31.005682] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709497087 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.934 [2024-11-18 14:21:31.005710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.934 [2024-11-18 14:21:31.005763] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.934 [2024-11-18 14:21:31.005779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.934 [2024-11-18 14:21:31.005830] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.934 [2024-11-18 14:21:31.005848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.934 [2024-11-18 14:21:31.005901] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.934 [2024-11-18 14:21:31.005916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.934 [2024-11-18 14:21:31.005970] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.934 [2024-11-18 14:21:31.005986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:34.934 #17 NEW cov: 12537 ft: 13982 corp: 5/359b lim: 120 exec/s: 0 rss: 72Mb L: 120/120 MS: 1 CopyPart- 00:08:35.195 [2024-11-18 14:21:31.065560] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709497087 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.195 [2024-11-18 14:21:31.065589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.195 [2024-11-18 14:21:31.065626] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.195 [2024-11-18 14:21:31.065642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.195 [2024-11-18 14:21:31.065699] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.195 [2024-11-18 14:21:31.065716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.195 #18 NEW cov: 12537 ft: 14047 corp: 6/443b lim: 120 exec/s: 0 rss: 72Mb L: 84/120 MS: 1 InsertByte- 00:08:35.196 [2024-11-18 14:21:31.105606] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709497087 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.196 [2024-11-18 14:21:31.105631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.196 [2024-11-18 14:21:31.105697] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.196 [2024-11-18 14:21:31.105713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.196 [2024-11-18 14:21:31.105767] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.196 [2024-11-18 14:21:31.105783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.196 #19 NEW cov: 12537 ft: 14217 corp: 7/527b lim: 120 exec/s: 0 rss: 72Mb L: 84/120 MS: 1 InsertByte- 00:08:35.196 [2024-11-18 14:21:31.145733] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7740398492046814059 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.196 [2024-11-18 14:21:31.145759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.196 [2024-11-18 14:21:31.145796] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7740398493724535659 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.196 [2024-11-18 14:21:31.145812] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.196 [2024-11-18 14:21:31.145866] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:7740398493674204011 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.196 [2024-11-18 14:21:31.145885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.196 #20 NEW cov: 12537 ft: 14268 corp: 8/606b lim: 120 exec/s: 0 rss: 73Mb L: 79/120 MS: 1 InsertRepeatedBytes- 00:08:35.196 [2024-11-18 14:21:31.205883] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7740398492046814059 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.196 [2024-11-18 14:21:31.205910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.196 [2024-11-18 14:21:31.205973] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7740398493724535659 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.196 [2024-11-18 14:21:31.205988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.196 [2024-11-18 14:21:31.206042] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:7740398493674204011 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.196 [2024-11-18 14:21:31.206056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.196 #21 NEW cov: 12537 ft: 14324 corp: 9/685b lim: 120 exec/s: 0 rss: 73Mb L: 79/120 MS: 1 ShuffleBytes- 00:08:35.196 [2024-11-18 14:21:31.266050] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709497087 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.196 [2024-11-18 14:21:31.266077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.196 [2024-11-18 14:21:31.266149] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.196 [2024-11-18 14:21:31.266165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.196 [2024-11-18 14:21:31.266219] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.196 [2024-11-18 14:21:31.266236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.196 #22 NEW cov: 12537 ft: 14437 corp: 10/769b lim: 120 exec/s: 0 rss: 73Mb L: 84/120 MS: 1 ChangeBit- 00:08:35.457 [2024-11-18 14:21:31.326249] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.457 [2024-11-18 14:21:31.326275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.457 [2024-11-18 14:21:31.326311] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 
lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.457 [2024-11-18 14:21:31.326327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.457 [2024-11-18 14:21:31.326382] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.457 [2024-11-18 14:21:31.326398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.457 #23 NEW cov: 12537 ft: 14473 corp: 11/855b lim: 120 exec/s: 0 rss: 73Mb L: 86/120 MS: 1 CopyPart- 00:08:35.457 [2024-11-18 14:21:31.366508] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709497087 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.457 [2024-11-18 14:21:31.366534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.457 [2024-11-18 14:21:31.366584] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14395756207946581959 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.457 [2024-11-18 14:21:31.366600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.457 [2024-11-18 14:21:31.366654] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.457 [2024-11-18 14:21:31.366670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.457 [2024-11-18 14:21:31.366723] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.457 [2024-11-18 14:21:31.366737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.457 #24 NEW cov: 12537 ft: 14544 corp: 12/952b lim: 120 exec/s: 0 rss: 73Mb L: 97/120 MS: 1 InsertRepeatedBytes- 00:08:35.457 [2024-11-18 14:21:31.406465] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18398330377715318783 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.457 [2024-11-18 14:21:31.406491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.457 [2024-11-18 14:21:31.406557] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.457 [2024-11-18 14:21:31.406574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.457 [2024-11-18 14:21:31.406630] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.457 [2024-11-18 14:21:31.406646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.457 #25 NEW cov: 12537 ft: 14590 corp: 13/1035b lim: 120 exec/s: 0 rss: 
73Mb L: 83/120 MS: 1 ChangeBinInt- 00:08:35.457 [2024-11-18 14:21:31.446574] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7740398492046814059 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.457 [2024-11-18 14:21:31.446600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.457 [2024-11-18 14:21:31.446667] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7740398493674204011 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.457 [2024-11-18 14:21:31.446683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.457 [2024-11-18 14:21:31.446746] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:10778403160497810283 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.457 [2024-11-18 14:21:31.446760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.457 NEW_FUNC[1/1]: 0x1c52b68 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:35.457 #26 NEW cov: 12560 ft: 14625 corp: 14/1107b lim: 120 exec/s: 0 rss: 73Mb L: 72/120 MS: 1 ChangeBinInt- 00:08:35.457 [2024-11-18 14:21:31.486705] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7740398492046814059 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.457 [2024-11-18 14:21:31.486731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.457 [2024-11-18 14:21:31.486766] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7740398493724535659 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.457 [2024-11-18 14:21:31.486786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.457 [2024-11-18 14:21:31.486839] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:7740398493674204011 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.457 [2024-11-18 14:21:31.486853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.458 #27 NEW cov: 12560 ft: 14649 corp: 15/1186b lim: 120 exec/s: 0 rss: 73Mb L: 79/120 MS: 1 ChangeBit- 00:08:35.458 [2024-11-18 14:21:31.546848] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7740398492046814059 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.458 [2024-11-18 14:21:31.546874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.458 [2024-11-18 14:21:31.546926] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7740398493724535659 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.458 [2024-11-18 14:21:31.546943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.458 [2024-11-18 14:21:31.546997] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:7740398493669485419 len:27500 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:35.458 [2024-11-18 14:21:31.547012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.458 #28 NEW cov: 12560 ft: 14736 corp: 16/1265b lim: 120 exec/s: 28 rss: 73Mb L: 79/120 MS: 1 ChangeByte- 00:08:35.719 [2024-11-18 14:21:31.586985] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18398330377715318783 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.719 [2024-11-18 14:21:31.587010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.719 [2024-11-18 14:21:31.587072] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.719 [2024-11-18 14:21:31.587088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.719 [2024-11-18 14:21:31.587143] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.719 [2024-11-18 14:21:31.587159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.719 #29 NEW cov: 12560 ft: 14748 corp: 17/1348b lim: 120 exec/s: 29 rss: 73Mb L: 83/120 MS: 1 CopyPart- 00:08:35.719 [2024-11-18 14:21:31.647301] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709497087 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.719 [2024-11-18 14:21:31.647328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.719 [2024-11-18 14:21:31.647379] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14395756207946581959 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.719 [2024-11-18 14:21:31.647395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.719 [2024-11-18 14:21:31.647450] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.719 [2024-11-18 14:21:31.647467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.719 [2024-11-18 14:21:31.647522] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.719 [2024-11-18 14:21:31.647542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.719 #30 NEW cov: 12560 ft: 14775 corp: 18/1445b lim: 120 exec/s: 30 rss: 73Mb L: 97/120 MS: 1 ChangeBinInt- 00:08:35.719 [2024-11-18 14:21:31.707328] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.719 [2024-11-18 14:21:31.707355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.719 
[2024-11-18 14:21:31.707396] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446696794709557247 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.719 [2024-11-18 14:21:31.707411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.719 [2024-11-18 14:21:31.707466] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.719 [2024-11-18 14:21:31.707483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.719 #31 NEW cov: 12560 ft: 14804 corp: 19/1529b lim: 120 exec/s: 31 rss: 73Mb L: 84/120 MS: 1 InsertByte- 00:08:35.719 [2024-11-18 14:21:31.747415] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18398330377715318783 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.719 [2024-11-18 14:21:31.747441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.719 [2024-11-18 14:21:31.747487] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.719 [2024-11-18 14:21:31.747502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.719 [2024-11-18 14:21:31.747561] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.719 [2024-11-18 14:21:31.747578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.719 #32 NEW cov: 12560 ft: 14831 corp: 20/1620b lim: 120 exec/s: 32 rss: 73Mb L: 91/120 MS: 1 CMP- DE: "\003\315\365'\271\213\213\000"- 00:08:35.719 [2024-11-18 14:21:31.787541] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.719 [2024-11-18 14:21:31.787572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.719 [2024-11-18 14:21:31.787644] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446696794709557247 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.719 [2024-11-18 14:21:31.787658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.719 [2024-11-18 14:21:31.787724] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.719 [2024-11-18 14:21:31.787739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.719 #33 NEW cov: 12560 ft: 14888 corp: 21/1704b lim: 120 exec/s: 33 rss: 73Mb L: 84/120 MS: 1 ChangeBit- 00:08:35.981 [2024-11-18 14:21:31.847746] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7740398492046814059 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:35.981 [2024-11-18 14:21:31.847773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.981 [2024-11-18 14:21:31.847824] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7740398480839633771 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.981 [2024-11-18 14:21:31.847841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.981 [2024-11-18 14:21:31.847896] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:7740398493674204011 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.981 [2024-11-18 14:21:31.847912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.981 #34 NEW cov: 12560 ft: 14911 corp: 22/1783b lim: 120 exec/s: 34 rss: 73Mb L: 79/120 MS: 1 ChangeBinInt- 00:08:35.981 [2024-11-18 14:21:31.907935] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709497087 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.981 [2024-11-18 14:21:31.907961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.981 [2024-11-18 14:21:31.907998] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.981 [2024-11-18 14:21:31.908014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.981 [2024-11-18 14:21:31.908067] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.981 [2024-11-18 14:21:31.908082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.981 #35 NEW cov: 12560 ft: 14922 corp: 23/1866b lim: 120 exec/s: 35 rss: 73Mb L: 83/120 MS: 1 CopyPart- 00:08:35.981 [2024-11-18 14:21:31.948009] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7740398492046814059 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.981 [2024-11-18 14:21:31.948035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.981 [2024-11-18 14:21:31.948083] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7740398493674204011 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.981 [2024-11-18 14:21:31.948099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.981 [2024-11-18 14:21:31.948152] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:7740398493674204011 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.981 [2024-11-18 14:21:31.948168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.981 #36 NEW cov: 12560 ft: 14930 corp: 24/1939b lim: 120 exec/s: 36 rss: 73Mb L: 73/120 MS: 1 InsertByte- 00:08:35.981 [2024-11-18 14:21:31.988123] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.981 [2024-11-18 14:21:31.988149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.981 [2024-11-18 14:21:31.988213] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65282 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.981 [2024-11-18 14:21:31.988229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.981 [2024-11-18 14:21:31.988281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.981 [2024-11-18 14:21:31.988296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.981 #37 NEW cov: 12560 ft: 14994 corp: 25/2022b lim: 120 exec/s: 37 rss: 73Mb L: 83/120 MS: 1 ChangeBinInt- 00:08:35.981 [2024-11-18 14:21:32.028224] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7740398492046814059 len:31340 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.981 [2024-11-18 14:21:32.028250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.981 [2024-11-18 14:21:32.028298] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7740398493674204011 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.981 [2024-11-18 14:21:32.028315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.981 [2024-11-18 14:21:32.028364] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:7740398493674204011 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.981 [2024-11-18 14:21:32.028380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.981 #38 NEW cov: 12560 ft: 15004 corp: 26/2095b lim: 120 exec/s: 38 rss: 73Mb L: 73/120 MS: 1 InsertByte- 00:08:35.981 [2024-11-18 14:21:32.068389] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7740398492476664683 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.981 [2024-11-18 14:21:32.068415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.981 [2024-11-18 14:21:32.068479] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7740398493724732267 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.981 [2024-11-18 14:21:32.068495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.981 [2024-11-18 14:21:32.068554] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:7740398493674204011 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.981 [2024-11-18 14:21:32.068570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:36.242 #39 NEW cov: 12560 ft: 15046 corp: 27/2175b lim: 120 exec/s: 39 rss: 74Mb L: 80/120 MS: 1 InsertByte- 00:08:36.242 [2024-11-18 14:21:32.128514] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709497087 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.242 [2024-11-18 14:21:32.128541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.242 [2024-11-18 14:21:32.128598] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.242 [2024-11-18 14:21:32.128615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.242 [2024-11-18 14:21:32.128668] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.242 [2024-11-18 14:21:32.128684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.242 #40 NEW cov: 12560 ft: 15069 corp: 28/2258b lim: 120 exec/s: 40 rss: 74Mb L: 83/120 MS: 1 EraseBytes- 00:08:36.242 [2024-11-18 14:21:32.188674] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7740398492046814059 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.242 [2024-11-18 14:21:32.188700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.242 [2024-11-18 14:21:32.188763] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7740398493674204011 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.242 [2024-11-18 14:21:32.188782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.242 [2024-11-18 14:21:32.188835] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:7740398493674204011 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.242 [2024-11-18 14:21:32.188850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.242 #41 NEW cov: 12560 ft: 15086 corp: 29/2331b lim: 120 exec/s: 41 rss: 74Mb L: 73/120 MS: 1 CopyPart- 00:08:36.242 [2024-11-18 14:21:32.248841] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709497087 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.242 [2024-11-18 14:21:32.248867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.242 [2024-11-18 14:21:32.248917] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.242 [2024-11-18 14:21:32.248933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.242 [2024-11-18 14:21:32.248986] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.242 [2024-11-18 
14:21:32.249019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.242 #42 NEW cov: 12560 ft: 15107 corp: 30/2425b lim: 120 exec/s: 42 rss: 74Mb L: 94/120 MS: 1 InsertRepeatedBytes- 00:08:36.242 [2024-11-18 14:21:32.288818] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709497087 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.242 [2024-11-18 14:21:32.288844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.243 [2024-11-18 14:21:32.288883] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.243 [2024-11-18 14:21:32.288899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.243 #43 NEW cov: 12560 ft: 15438 corp: 31/2492b lim: 120 exec/s: 43 rss: 74Mb L: 67/120 MS: 1 EraseBytes- 00:08:36.243 [2024-11-18 14:21:32.349151] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18398330377715318783 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.243 [2024-11-18 14:21:32.349176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.243 [2024-11-18 14:21:32.349230] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.243 [2024-11-18 14:21:32.349245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.243 [2024-11-18 14:21:32.349298] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551432 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.243 [2024-11-18 14:21:32.349313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.243 #44 NEW cov: 12560 ft: 15454 corp: 32/2575b lim: 120 exec/s: 44 rss: 74Mb L: 83/120 MS: 1 ChangeByte- 00:08:36.504 [2024-11-18 14:21:32.389257] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709497087 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.504 [2024-11-18 14:21:32.389282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.504 [2024-11-18 14:21:32.389337] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.504 [2024-11-18 14:21:32.389352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.504 [2024-11-18 14:21:32.389404] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.504 [2024-11-18 14:21:32.389419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.504 #45 NEW cov: 12560 ft: 15464 corp: 33/2667b 
lim: 120 exec/s: 45 rss: 74Mb L: 92/120 MS: 1 CMP- DE: "aN\023\340\215\177\000\000"- 00:08:36.504 [2024-11-18 14:21:32.449441] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:750811870367804267 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.504 [2024-11-18 14:21:32.449466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.504 [2024-11-18 14:21:32.449529] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7956571275838516846 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.504 [2024-11-18 14:21:32.449545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.504 [2024-11-18 14:21:32.449606] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:7740398493674204011 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.504 [2024-11-18 14:21:32.449622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.504 #46 NEW cov: 12560 ft: 15510 corp: 34/2760b lim: 120 exec/s: 46 rss: 74Mb L: 93/120 MS: 1 CrossOver- 00:08:36.504 [2024-11-18 14:21:32.489540] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.504 [2024-11-18 14:21:32.489569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.504 [2024-11-18 14:21:32.489640] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446696794709557247 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.504 [2024-11-18 14:21:32.489657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.504 [2024-11-18 14:21:32.489710] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.504 [2024-11-18 14:21:32.489737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.504 #47 NEW cov: 12560 ft: 15522 corp: 35/2844b lim: 120 exec/s: 47 rss: 74Mb L: 84/120 MS: 1 CopyPart- 00:08:36.504 [2024-11-18 14:21:32.549754] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709497087 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.504 [2024-11-18 14:21:32.549779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.504 [2024-11-18 14:21:32.549815] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.504 [2024-11-18 14:21:32.549831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.504 [2024-11-18 14:21:32.549885] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.504 [2024-11-18 14:21:32.549901] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.504 #48 NEW cov: 12560 ft: 15523 corp: 36/2928b lim: 120 exec/s: 24 rss: 74Mb L: 84/120 MS: 1 ChangeByte- 00:08:36.504 #48 DONE cov: 12560 ft: 15523 corp: 36/2928b lim: 120 exec/s: 24 rss: 74Mb 00:08:36.504 ###### Recommended dictionary. ###### 00:08:36.504 "\003\315\365'\271\213\213\000" # Uses: 0 00:08:36.504 "aN\023\340\215\177\000\000" # Uses: 0 00:08:36.504 ###### End of recommended dictionary. ###### 00:08:36.504 Done 48 runs in 2 second(s) 00:08:36.765 14:21:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_17.conf /var/tmp/suppress_nvmf_fuzz 00:08:36.765 14:21:32 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:36.765 14:21:32 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:36.765 14:21:32 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:08:36.765 14:21:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:08:36.765 14:21:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:36.765 14:21:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:36.765 14:21:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:36.765 14:21:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:08:36.765 14:21:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:36.765 14:21:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:36.765 14:21:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 18 00:08:36.765 14:21:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4418 00:08:36.765 14:21:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:36.765 14:21:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:08:36.765 14:21:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:36.765 14:21:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:36.765 14:21:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:36.765 14:21:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 00:08:36.765 [2024-11-18 14:21:32.717137] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 
00:08:36.765 [2024-11-18 14:21:32.717232] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid330481 ] 00:08:37.026 [2024-11-18 14:21:32.996911] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:37.026 [2024-11-18 14:21:33.019286] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:37.026 [2024-11-18 14:21:33.071687] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:37.026 [2024-11-18 14:21:33.088019] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:08:37.026 INFO: Running with entropic power schedule (0xFF, 100). 00:08:37.026 INFO: Seed: 1394013998 00:08:37.026 INFO: Loaded 1 modules (387495 inline 8-bit counters): 387495 [0x2acbe4c, 0x2b2a7f3), 00:08:37.026 INFO: Loaded 1 PC tables (387495 PCs): 387495 [0x2b2a7f8,0x3114268), 00:08:37.026 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:37.026 INFO: A corpus is not provided, starting from an empty corpus 00:08:37.026 #2 INITED exec/s: 0 rss: 64Mb 00:08:37.026 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:37.026 This may also happen if the target rejected all inputs we tried so far 00:08:37.026 [2024-11-18 14:21:33.143216] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.026 [2024-11-18 14:21:33.143242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.547 NEW_FUNC[1/715]: 0x476df8 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:08:37.547 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:37.547 #3 NEW cov: 12276 ft: 12269 corp: 2/34b lim: 100 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:37.547 [2024-11-18 14:21:33.474511] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.547 [2024-11-18 14:21:33.474605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.547 #9 NEW cov: 12389 ft: 13015 corp: 3/67b lim: 100 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 ShuffleBytes- 00:08:37.547 [2024-11-18 14:21:33.544162] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.547 [2024-11-18 14:21:33.544190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.547 #10 NEW cov: 12395 ft: 13376 corp: 4/101b lim: 100 exec/s: 0 rss: 72Mb L: 34/34 MS: 1 InsertByte- 00:08:37.547 [2024-11-18 14:21:33.584266] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.547 [2024-11-18 14:21:33.584291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.547 #11 NEW cov: 12480 ft: 13625 corp: 5/127b lim: 100 exec/s: 0 rss: 72Mb L: 26/34 MS: 1 EraseBytes- 00:08:37.547 [2024-11-18 14:21:33.644435] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.547 [2024-11-18 14:21:33.644461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.807 #12 NEW cov: 12480 ft: 13771 corp: 6/161b lim: 100 exec/s: 0 rss: 72Mb L: 34/34 MS: 1 InsertByte- 00:08:37.807 [2024-11-18 14:21:33.704573] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.807 [2024-11-18 14:21:33.704598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.807 #13 NEW cov: 12480 ft: 13882 corp: 7/194b lim: 100 exec/s: 0 rss: 72Mb L: 33/34 MS: 1 ChangeBinInt- 00:08:37.807 [2024-11-18 14:21:33.744694] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.807 [2024-11-18 14:21:33.744720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.807 #14 NEW cov: 12480 ft: 13995 corp: 8/227b lim: 100 exec/s: 0 rss: 72Mb L: 33/34 MS: 1 CopyPart- 00:08:37.807 [2024-11-18 14:21:33.784822] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.807 [2024-11-18 14:21:33.784846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.807 #15 NEW cov: 12480 ft: 14059 corp: 9/254b lim: 100 exec/s: 0 rss: 72Mb L: 27/34 MS: 1 EraseBytes- 00:08:37.807 [2024-11-18 14:21:33.824885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.807 [2024-11-18 14:21:33.824909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.807 #16 NEW cov: 12480 ft: 14080 corp: 10/278b lim: 100 exec/s: 0 rss: 72Mb L: 24/34 MS: 1 EraseBytes- 00:08:37.807 [2024-11-18 14:21:33.865038] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.807 [2024-11-18 14:21:33.865063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.807 #17 NEW cov: 12480 ft: 14203 corp: 11/305b lim: 100 exec/s: 0 rss: 72Mb L: 27/34 MS: 1 ShuffleBytes- 00:08:37.807 [2024-11-18 14:21:33.925424] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.807 [2024-11-18 14:21:33.925448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.807 [2024-11-18 14:21:33.925501] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:37.807 [2024-11-18 14:21:33.925515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.807 [2024-11-18 14:21:33.925577] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:37.807 [2024-11-18 14:21:33.925593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.068 #18 NEW cov: 12480 ft: 14639 corp: 12/384b lim: 100 exec/s: 0 rss: 72Mb L: 79/79 MS: 1 
InsertRepeatedBytes- 00:08:38.068 [2024-11-18 14:21:33.985467] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.068 [2024-11-18 14:21:33.985492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.068 [2024-11-18 14:21:33.985546] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.068 [2024-11-18 14:21:33.985565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.068 #19 NEW cov: 12480 ft: 14897 corp: 13/425b lim: 100 exec/s: 0 rss: 72Mb L: 41/79 MS: 1 InsertRepeatedBytes- 00:08:38.068 [2024-11-18 14:21:34.025610] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.068 [2024-11-18 14:21:34.025633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.068 [2024-11-18 14:21:34.025684] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.068 [2024-11-18 14:21:34.025699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.068 NEW_FUNC[1/1]: 0x1c52b68 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:38.068 #20 NEW cov: 12503 ft: 14953 corp: 14/481b lim: 100 exec/s: 0 rss: 73Mb L: 56/79 MS: 1 InsertRepeatedBytes- 00:08:38.068 [2024-11-18 14:21:34.085876] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.068 [2024-11-18 14:21:34.085901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.068 [2024-11-18 14:21:34.085947] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.068 [2024-11-18 14:21:34.085962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.068 [2024-11-18 14:21:34.086035] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.068 [2024-11-18 14:21:34.086051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.068 #21 NEW cov: 12503 ft: 15045 corp: 15/553b lim: 100 exec/s: 0 rss: 73Mb L: 72/79 MS: 1 CrossOver- 00:08:38.068 [2024-11-18 14:21:34.145827] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.068 [2024-11-18 14:21:34.145852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.068 #32 NEW cov: 12503 ft: 15103 corp: 16/583b lim: 100 exec/s: 32 rss: 73Mb L: 30/79 MS: 1 CopyPart- 00:08:38.329 [2024-11-18 14:21:34.206014] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.329 [2024-11-18 14:21:34.206042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.329 #33 NEW cov: 12503 ft: 15116 corp: 17/617b lim: 100 exec/s: 33 rss: 73Mb L: 34/79 MS: 1 InsertByte- 
00:08:38.329 [2024-11-18 14:21:34.266183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.329 [2024-11-18 14:21:34.266209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.329 #34 NEW cov: 12503 ft: 15167 corp: 18/650b lim: 100 exec/s: 34 rss: 73Mb L: 33/79 MS: 1 ChangeBit- 00:08:38.329 [2024-11-18 14:21:34.306513] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.329 [2024-11-18 14:21:34.306538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.329 [2024-11-18 14:21:34.306594] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.329 [2024-11-18 14:21:34.306610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.329 [2024-11-18 14:21:34.306665] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.329 [2024-11-18 14:21:34.306677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.329 #35 NEW cov: 12503 ft: 15182 corp: 19/722b lim: 100 exec/s: 35 rss: 73Mb L: 72/79 MS: 1 ShuffleBytes- 00:08:38.329 [2024-11-18 14:21:34.366469] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.329 [2024-11-18 14:21:34.366495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.329 #36 NEW cov: 12503 ft: 15198 corp: 20/755b lim: 100 exec/s: 36 rss: 73Mb L: 33/79 MS: 1 CrossOver- 00:08:38.329 [2024-11-18 14:21:34.426644] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.329 [2024-11-18 14:21:34.426669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.329 #37 NEW cov: 12503 ft: 15219 corp: 21/788b lim: 100 exec/s: 37 rss: 73Mb L: 33/79 MS: 1 ShuffleBytes- 00:08:38.590 [2024-11-18 14:21:34.466864] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.590 [2024-11-18 14:21:34.466890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.590 [2024-11-18 14:21:34.466942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.590 [2024-11-18 14:21:34.466955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.590 #38 NEW cov: 12503 ft: 15225 corp: 22/830b lim: 100 exec/s: 38 rss: 73Mb L: 42/79 MS: 1 InsertRepeatedBytes- 00:08:38.590 [2024-11-18 14:21:34.506868] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.590 [2024-11-18 14:21:34.506894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.590 #39 NEW cov: 12503 ft: 15263 corp: 23/864b lim: 100 exec/s: 39 rss: 73Mb L: 34/79 MS: 1 ChangeByte- 00:08:38.590 [2024-11-18 
14:21:34.567043] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.590 [2024-11-18 14:21:34.567068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.590 #40 NEW cov: 12503 ft: 15284 corp: 24/890b lim: 100 exec/s: 40 rss: 73Mb L: 26/79 MS: 1 ShuffleBytes- 00:08:38.590 [2024-11-18 14:21:34.607145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.590 [2024-11-18 14:21:34.607169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.590 #41 NEW cov: 12503 ft: 15309 corp: 25/925b lim: 100 exec/s: 41 rss: 73Mb L: 35/79 MS: 1 InsertByte- 00:08:38.590 [2024-11-18 14:21:34.667545] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.590 [2024-11-18 14:21:34.667573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.590 [2024-11-18 14:21:34.667654] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.590 [2024-11-18 14:21:34.667668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.590 [2024-11-18 14:21:34.667736] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.590 [2024-11-18 14:21:34.667750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.590 #42 NEW cov: 12503 ft: 15321 corp: 26/998b lim: 100 exec/s: 42 rss: 73Mb L: 73/79 MS: 1 InsertByte- 00:08:38.851 [2024-11-18 14:21:34.727714] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.851 [2024-11-18 14:21:34.727741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.851 [2024-11-18 14:21:34.727789] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.851 [2024-11-18 14:21:34.727804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.851 [2024-11-18 14:21:34.727857] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.851 [2024-11-18 14:21:34.727873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.851 #43 NEW cov: 12503 ft: 15400 corp: 27/1077b lim: 100 exec/s: 43 rss: 74Mb L: 79/79 MS: 1 ChangeByte- 00:08:38.851 [2024-11-18 14:21:34.787682] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.851 [2024-11-18 14:21:34.787709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.851 #44 NEW cov: 12503 ft: 15420 corp: 28/1101b lim: 100 exec/s: 44 rss: 74Mb L: 24/79 MS: 1 EraseBytes- 00:08:38.851 [2024-11-18 14:21:34.827800] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.851 [2024-11-18 14:21:34.827826] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.851 #45 NEW cov: 12503 ft: 15426 corp: 29/1136b lim: 100 exec/s: 45 rss: 74Mb L: 35/79 MS: 1 InsertByte- 00:08:38.851 [2024-11-18 14:21:34.867920] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.851 [2024-11-18 14:21:34.867944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.851 #46 NEW cov: 12503 ft: 15429 corp: 30/1170b lim: 100 exec/s: 46 rss: 74Mb L: 34/79 MS: 1 InsertByte- 00:08:38.851 [2024-11-18 14:21:34.908004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.851 [2024-11-18 14:21:34.908028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.851 #47 NEW cov: 12503 ft: 15442 corp: 31/1204b lim: 100 exec/s: 47 rss: 74Mb L: 34/79 MS: 1 CrossOver- 00:08:38.851 [2024-11-18 14:21:34.948147] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.851 [2024-11-18 14:21:34.948172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.851 #48 NEW cov: 12503 ft: 15502 corp: 32/1238b lim: 100 exec/s: 48 rss: 74Mb L: 34/79 MS: 1 ShuffleBytes- 00:08:39.111 [2024-11-18 14:21:34.988249] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.111 [2024-11-18 14:21:34.988275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.111 #49 NEW cov: 12503 ft: 15528 corp: 33/1271b lim: 100 exec/s: 49 rss: 74Mb L: 33/79 MS: 1 CopyPart- 00:08:39.111 [2024-11-18 14:21:35.048665] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.111 [2024-11-18 14:21:35.048689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.111 [2024-11-18 14:21:35.048744] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.111 [2024-11-18 14:21:35.048759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.111 [2024-11-18 14:21:35.048813] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.111 [2024-11-18 14:21:35.048827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.111 #50 NEW cov: 12503 ft: 15538 corp: 34/1335b lim: 100 exec/s: 50 rss: 74Mb L: 64/79 MS: 1 CrossOver- 00:08:39.111 [2024-11-18 14:21:35.108563] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.111 [2024-11-18 14:21:35.108588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.111 #51 NEW cov: 12503 ft: 15540 corp: 35/1365b lim: 100 exec/s: 25 rss: 74Mb L: 30/79 MS: 1 ChangeByte- 00:08:39.111 #51 DONE cov: 12503 ft: 15540 corp: 35/1365b lim: 100 exec/s: 25 rss: 74Mb 00:08:39.111 Done 51 
runs in 2 second(s) 00:08:39.372 14:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_18.conf /var/tmp/suppress_nvmf_fuzz 00:08:39.372 14:21:35 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:39.372 14:21:35 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:39.372 14:21:35 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:08:39.372 14:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:08:39.372 14:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:39.372 14:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:39.372 14:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:39.372 14:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:08:39.372 14:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:39.372 14:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:39.372 14:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 19 00:08:39.372 14:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4419 00:08:39.372 14:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:39.372 14:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:08:39.372 14:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:39.372 14:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:39.372 14:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:39.372 14:21:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 00:08:39.372 [2024-11-18 14:21:35.298131] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:08:39.372 [2024-11-18 14:21:35.298204] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid331010 ] 00:08:39.632 [2024-11-18 14:21:35.575015] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:39.632 [2024-11-18 14:21:35.593625] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:39.632 [2024-11-18 14:21:35.646100] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:39.632 [2024-11-18 14:21:35.662428] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:08:39.632 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:39.632 INFO: Seed: 3966995800 00:08:39.632 INFO: Loaded 1 modules (387495 inline 8-bit counters): 387495 [0x2acbe4c, 0x2b2a7f3), 00:08:39.632 INFO: Loaded 1 PC tables (387495 PCs): 387495 [0x2b2a7f8,0x3114268), 00:08:39.632 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:39.632 INFO: A corpus is not provided, starting from an empty corpus 00:08:39.632 #2 INITED exec/s: 0 rss: 64Mb 00:08:39.632 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:39.632 This may also happen if the target rejected all inputs we tried so far 00:08:39.632 [2024-11-18 14:21:35.720938] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:13455272147882261178 len:47627 00:08:39.633 [2024-11-18 14:21:35.720968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.153 NEW_FUNC[1/715]: 0x479db8 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:08:40.153 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:40.153 #11 NEW cov: 12253 ft: 12253 corp: 2/11b lim: 50 exec/s: 0 rss: 72Mb L: 10/10 MS: 4 CrossOver-ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:40.153 [2024-11-18 14:21:36.052084] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7595718146404215145 len:26986 00:08:40.153 [2024-11-18 14:21:36.052141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.153 [2024-11-18 14:21:36.052223] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:7595718147998050665 len:26986 00:08:40.153 [2024-11-18 14:21:36.052254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.153 #15 NEW cov: 12367 ft: 13352 corp: 3/32b lim: 50 exec/s: 0 rss: 72Mb L: 21/21 MS: 4 CrossOver-CrossOver-CrossOver-InsertRepeatedBytes- 00:08:40.153 [2024-11-18 14:21:36.101881] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7595718146404215145 len:26986 00:08:40.153 [2024-11-18 14:21:36.101909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.153 #16 NEW cov: 12373 ft: 13610 corp: 4/44b lim: 50 exec/s: 0 rss: 72Mb L: 12/21 MS: 1 EraseBytes- 00:08:40.153 [2024-11-18 14:21:36.162057] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7595718146404215145 len:26986 00:08:40.153 [2024-11-18 14:21:36.162084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.153 #17 NEW cov: 12458 ft: 13814 corp: 5/61b lim: 50 exec/s: 0 rss: 72Mb L: 17/21 MS: 1 CopyPart- 00:08:40.153 [2024-11-18 14:21:36.222326] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12080808863958804391 len:42920 00:08:40.153 [2024-11-18 14:21:36.222357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.153 [2024-11-18 14:21:36.222428] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12080808863958804391 len:42920 00:08:40.153 [2024-11-18 14:21:36.222445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.153 #19 NEW cov: 12458 ft: 14017 corp: 6/82b lim: 50 exec/s: 0 rss: 72Mb L: 21/21 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:40.153 [2024-11-18 14:21:36.262441] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:19198678493440 len:22831 00:08:40.153 [2024-11-18 14:21:36.262469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.153 [2024-11-18 14:21:36.262522] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:7595718146236442985 len:26986 00:08:40.153 [2024-11-18 14:21:36.262537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.414 #20 NEW cov: 12458 ft: 14147 corp: 7/103b lim: 50 exec/s: 0 rss: 72Mb L: 21/21 MS: 1 CMP- DE: "\000\000\000\021vY.\000"- 00:08:40.414 [2024-11-18 14:21:36.302426] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18377617133426180095 len:26986 00:08:40.414 [2024-11-18 14:21:36.302454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.414 #21 NEW cov: 12458 ft: 14200 corp: 8/120b lim: 50 exec/s: 0 rss: 72Mb L: 17/21 MS: 1 InsertRepeatedBytes- 00:08:40.414 [2024-11-18 14:21:36.342669] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7595718146404215145 len:26986 00:08:40.414 [2024-11-18 14:21:36.342696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.414 [2024-11-18 14:21:36.342734] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:7595718147998050665 len:26986 00:08:40.414 [2024-11-18 14:21:36.342750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.414 #22 NEW cov: 12458 ft: 14231 corp: 9/146b lim: 50 exec/s: 0 rss: 72Mb L: 26/26 MS: 1 CopyPart- 00:08:40.414 [2024-11-18 14:21:36.382777] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7595718146404215145 len:26986 00:08:40.414 [2024-11-18 14:21:36.382803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.414 [2024-11-18 14:21:36.382839] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:7595718147998050665 len:26986 00:08:40.414 [2024-11-18 14:21:36.382853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.414 #23 NEW cov: 12458 ft: 14308 corp: 10/167b lim: 50 exec/s: 0 rss: 72Mb L: 21/26 MS: 1 ShuffleBytes- 00:08:40.414 [2024-11-18 14:21:36.422755] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE 
UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1258291200167772160 len:106 00:08:40.414 [2024-11-18 14:21:36.422783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.414 #24 NEW cov: 12458 ft: 14385 corp: 11/184b lim: 50 exec/s: 0 rss: 72Mb L: 17/26 MS: 1 PersAutoDict- DE: "\000\000\000\021vY.\000"- 00:08:40.414 [2024-11-18 14:21:36.482892] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7595718146404215145 len:26986 00:08:40.414 [2024-11-18 14:21:36.482919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.414 #25 NEW cov: 12458 ft: 14438 corp: 12/197b lim: 50 exec/s: 0 rss: 72Mb L: 13/26 MS: 1 EraseBytes- 00:08:40.414 [2024-11-18 14:21:36.523035] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7595718146404215145 len:26986 00:08:40.414 [2024-11-18 14:21:36.523061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.675 #26 NEW cov: 12458 ft: 14448 corp: 13/211b lim: 50 exec/s: 0 rss: 72Mb L: 14/26 MS: 1 CopyPart- 00:08:40.675 [2024-11-18 14:21:36.563254] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:19198678493440 len:22831 00:08:40.675 [2024-11-18 14:21:36.563281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.675 [2024-11-18 14:21:36.563331] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:7595718146236442985 len:18794 00:08:40.675 [2024-11-18 14:21:36.563347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.675 NEW_FUNC[1/1]: 0x1c52b68 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:40.675 #27 NEW cov: 12481 ft: 14486 corp: 14/232b lim: 50 exec/s: 0 rss: 72Mb L: 21/26 MS: 1 ChangeBit- 00:08:40.675 [2024-11-18 14:21:36.623289] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:13455272147882261178 len:47627 00:08:40.675 [2024-11-18 14:21:36.623317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.675 #28 NEW cov: 12481 ft: 14512 corp: 15/242b lim: 50 exec/s: 0 rss: 72Mb L: 10/26 MS: 1 ShuffleBytes- 00:08:40.675 [2024-11-18 14:21:36.683618] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18782066665728 len:22831 00:08:40.675 [2024-11-18 14:21:36.683645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.675 [2024-11-18 14:21:36.683693] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:7595718146236442985 len:18794 00:08:40.675 [2024-11-18 14:21:36.683709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.675 #29 NEW cov: 12481 ft: 14557 corp: 16/263b lim: 50 exec/s: 29 rss: 73Mb L: 21/26 MS: 1 ChangeBinInt- 00:08:40.675 [2024-11-18 14:21:36.743861] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:19198678493440 len:22831 00:08:40.675 [2024-11-18 14:21:36.743888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.675 [2024-11-18 14:21:36.743945] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3834029159532292405 len:13622 00:08:40.675 [2024-11-18 14:21:36.743960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.675 [2024-11-18 14:21:36.744013] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3834029160418063669 len:13674 00:08:40.675 [2024-11-18 14:21:36.744027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.675 #30 NEW cov: 12481 ft: 14849 corp: 17/300b lim: 50 exec/s: 30 rss: 73Mb L: 37/37 MS: 1 InsertRepeatedBytes- 00:08:40.675 [2024-11-18 14:21:36.783843] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7595718146404215145 len:26986 00:08:40.675 [2024-11-18 14:21:36.783870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.675 [2024-11-18 14:21:36.783935] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:7595718147998050665 len:26986 00:08:40.675 [2024-11-18 14:21:36.783954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.936 #31 NEW cov: 12481 ft: 14871 corp: 18/321b lim: 50 exec/s: 31 rss: 73Mb L: 21/37 MS: 1 ShuffleBytes- 00:08:40.936 [2024-11-18 14:21:36.843949] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18404252802609053695 len:26986 00:08:40.936 [2024-11-18 14:21:36.843976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.936 #32 NEW cov: 12481 ft: 14886 corp: 19/338b lim: 50 exec/s: 32 rss: 73Mb L: 17/37 MS: 1 ShuffleBytes- 00:08:40.936 [2024-11-18 14:21:36.904087] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:13455074536436972218 len:1 00:08:40.936 [2024-11-18 14:21:36.904115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.936 #33 NEW cov: 12481 ft: 14888 corp: 20/352b lim: 50 exec/s: 33 rss: 73Mb L: 14/37 MS: 1 CMP- DE: "\007\000\000\000"- 00:08:40.936 [2024-11-18 14:21:36.944220] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:11427437004366375273 len:26986 00:08:40.936 [2024-11-18 14:21:36.944247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.936 #34 NEW cov: 12481 ft: 14955 corp: 21/369b lim: 50 exec/s: 34 rss: 73Mb L: 17/37 MS: 1 ChangeBinInt- 00:08:40.936 [2024-11-18 14:21:36.984315] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:8527897946937622545 len:47627 00:08:40.936 [2024-11-18 
14:21:36.984342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.936 #35 NEW cov: 12481 ft: 14964 corp: 22/379b lim: 50 exec/s: 35 rss: 73Mb L: 10/37 MS: 1 PersAutoDict- DE: "\000\000\000\021vY.\000"- 00:08:40.936 [2024-11-18 14:21:37.044454] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7595614341164957695 len:26986 00:08:40.936 [2024-11-18 14:21:37.044481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.197 #36 NEW cov: 12481 ft: 14970 corp: 23/396b lim: 50 exec/s: 36 rss: 73Mb L: 17/37 MS: 1 ShuffleBytes- 00:08:41.197 [2024-11-18 14:21:37.104753] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7595718146404215145 len:26986 00:08:41.197 [2024-11-18 14:21:37.104779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.197 [2024-11-18 14:21:37.104819] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:7595718147998050665 len:26986 00:08:41.197 [2024-11-18 14:21:37.104835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.197 #37 NEW cov: 12481 ft: 14988 corp: 24/418b lim: 50 exec/s: 37 rss: 73Mb L: 22/37 MS: 1 CrossOver- 00:08:41.197 [2024-11-18 14:21:37.144852] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7595718150366598796 len:26986 00:08:41.197 [2024-11-18 14:21:37.144879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.197 [2024-11-18 14:21:37.144941] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:7595718147998050665 len:26986 00:08:41.197 [2024-11-18 14:21:37.144957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.197 #38 NEW cov: 12481 ft: 15004 corp: 25/444b lim: 50 exec/s: 38 rss: 73Mb L: 26/37 MS: 1 ChangeBinInt- 00:08:41.197 [2024-11-18 14:21:37.205012] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:750246712743821312 len:26986 00:08:41.197 [2024-11-18 14:21:37.205042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.197 [2024-11-18 14:21:37.205108] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:7595718147998050665 len:26986 00:08:41.197 [2024-11-18 14:21:37.205124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.197 #39 NEW cov: 12481 ft: 15018 corp: 26/469b lim: 50 exec/s: 39 rss: 73Mb L: 25/37 MS: 1 PersAutoDict- DE: "\007\000\000\000"- 00:08:41.197 [2024-11-18 14:21:37.265072] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12080808863958804391 len:42920 00:08:41.197 [2024-11-18 14:21:37.265099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:08:41.197 #40 NEW cov: 12481 ft: 15042 corp: 27/481b lim: 50 exec/s: 40 rss: 73Mb L: 12/37 MS: 1 EraseBytes- 00:08:41.457 [2024-11-18 14:21:37.325429] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:19198678493440 len:22792 00:08:41.457 [2024-11-18 14:21:37.325457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.457 [2024-11-18 14:21:37.325511] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:29670772247101486 len:26986 00:08:41.457 [2024-11-18 14:21:37.325527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.457 #41 NEW cov: 12481 ft: 15066 corp: 28/506b lim: 50 exec/s: 41 rss: 73Mb L: 25/37 MS: 1 PersAutoDict- DE: "\007\000\000\000"- 00:08:41.457 [2024-11-18 14:21:37.365482] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7595718150366598796 len:26986 00:08:41.457 [2024-11-18 14:21:37.365509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.457 [2024-11-18 14:21:37.365564] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:7595718147998089065 len:26986 00:08:41.457 [2024-11-18 14:21:37.365580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.457 #42 NEW cov: 12481 ft: 15078 corp: 29/532b lim: 50 exec/s: 42 rss: 74Mb L: 26/37 MS: 1 ChangeByte- 00:08:41.457 [2024-11-18 14:21:37.425760] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:26491532962048 len:6169 00:08:41.457 [2024-11-18 14:21:37.425786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.458 [2024-11-18 14:21:37.425849] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1736164148113840152 len:6169 00:08:41.458 [2024-11-18 14:21:37.425865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.458 [2024-11-18 14:21:37.425917] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3314765225840940377 len:26986 00:08:41.458 [2024-11-18 14:21:37.425933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.458 #43 NEW cov: 12481 ft: 15087 corp: 30/568b lim: 50 exec/s: 43 rss: 74Mb L: 36/37 MS: 1 InsertRepeatedBytes- 00:08:41.458 [2024-11-18 14:21:37.485712] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12080808863958804391 len:42879 00:08:41.458 [2024-11-18 14:21:37.485740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.458 #44 NEW cov: 12481 ft: 15101 corp: 31/580b lim: 50 exec/s: 44 rss: 74Mb L: 12/37 MS: 1 ChangeByte- 00:08:41.458 [2024-11-18 14:21:37.546064] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 
lba:7595718146404215145 len:26986 00:08:41.458 [2024-11-18 14:21:37.546092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.458 [2024-11-18 14:21:37.546158] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:115901460973312 len:26986 00:08:41.458 [2024-11-18 14:21:37.546174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.458 #45 NEW cov: 12481 ft: 15119 corp: 32/606b lim: 50 exec/s: 45 rss: 74Mb L: 26/37 MS: 1 PersAutoDict- DE: "\007\000\000\000"- 00:08:41.718 [2024-11-18 14:21:37.586039] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:13455272147882261178 len:47627 00:08:41.718 [2024-11-18 14:21:37.586068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.718 #46 NEW cov: 12481 ft: 15218 corp: 33/616b lim: 50 exec/s: 46 rss: 74Mb L: 10/37 MS: 1 ShuffleBytes- 00:08:41.718 [2024-11-18 14:21:37.626203] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:19198678493440 len:22831 00:08:41.718 [2024-11-18 14:21:37.626229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.718 [2024-11-18 14:21:37.626265] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:7595718146236442985 len:18794 00:08:41.718 [2024-11-18 14:21:37.626281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.718 #47 NEW cov: 12481 ft: 15249 corp: 34/637b lim: 50 exec/s: 47 rss: 74Mb L: 21/37 MS: 1 ShuffleBytes- 00:08:41.718 [2024-11-18 14:21:37.666214] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:13455272146959514298 len:47803 00:08:41.718 [2024-11-18 14:21:37.666240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.718 #48 NEW cov: 12481 ft: 15258 corp: 35/648b lim: 50 exec/s: 48 rss: 74Mb L: 11/37 MS: 1 InsertByte- 00:08:41.718 [2024-11-18 14:21:37.706339] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12080808863958804391 len:42920 00:08:41.718 [2024-11-18 14:21:37.706366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.719 #49 NEW cov: 12481 ft: 15262 corp: 36/664b lim: 50 exec/s: 24 rss: 74Mb L: 16/37 MS: 1 CopyPart- 00:08:41.719 #49 DONE cov: 12481 ft: 15262 corp: 36/664b lim: 50 exec/s: 24 rss: 74Mb 00:08:41.719 ###### Recommended dictionary. ###### 00:08:41.719 "\000\000\000\021vY.\000" # Uses: 2 00:08:41.719 "\007\000\000\000" # Uses: 3 00:08:41.719 ###### End of recommended dictionary. 
###### 00:08:41.719 Done 49 runs in 2 second(s) 00:08:41.719 14:21:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_19.conf /var/tmp/suppress_nvmf_fuzz 00:08:41.979 14:21:37 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:41.979 14:21:37 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:41.979 14:21:37 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:08:41.979 14:21:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:08:41.979 14:21:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:41.979 14:21:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:41.979 14:21:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:41.979 14:21:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:08:41.979 14:21:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:41.979 14:21:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:41.979 14:21:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 20 00:08:41.979 14:21:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4420 00:08:41.979 14:21:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:41.979 14:21:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:08:41.979 14:21:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:41.979 14:21:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:41.979 14:21:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:41.980 14:21:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 00:08:41.980 [2024-11-18 14:21:37.889786] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 
00:08:41.980 [2024-11-18 14:21:37.889859] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid331316 ] 00:08:42.240 [2024-11-18 14:21:38.165919] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:42.240 [2024-11-18 14:21:38.185776] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:42.240 [2024-11-18 14:21:38.238182] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:42.240 [2024-11-18 14:21:38.254505] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:08:42.240 INFO: Running with entropic power schedule (0xFF, 100). 00:08:42.240 INFO: Seed: 2266047771 00:08:42.240 INFO: Loaded 1 modules (387495 inline 8-bit counters): 387495 [0x2acbe4c, 0x2b2a7f3), 00:08:42.240 INFO: Loaded 1 PC tables (387495 PCs): 387495 [0x2b2a7f8,0x3114268), 00:08:42.240 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:42.240 INFO: A corpus is not provided, starting from an empty corpus 00:08:42.240 #2 INITED exec/s: 0 rss: 64Mb 00:08:42.240 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:42.240 This may also happen if the target rejected all inputs we tried so far 00:08:42.240 [2024-11-18 14:21:38.319852] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.240 [2024-11-18 14:21:38.319881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.512 NEW_FUNC[1/717]: 0x47b978 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:42.512 NEW_FUNC[2/717]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:42.512 #8 NEW cov: 12294 ft: 12295 corp: 2/34b lim: 90 exec/s: 0 rss: 71Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:42.772 [2024-11-18 14:21:38.650789] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.772 [2024-11-18 14:21:38.650845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.772 #9 NEW cov: 12424 ft: 13062 corp: 3/67b lim: 90 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 CopyPart- 00:08:42.772 [2024-11-18 14:21:38.720756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.772 [2024-11-18 14:21:38.720785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.772 #10 NEW cov: 12430 ft: 13303 corp: 4/100b lim: 90 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 ChangeByte- 00:08:42.772 [2024-11-18 14:21:38.760986] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.772 [2024-11-18 14:21:38.761015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.772 [2024-11-18 14:21:38.761073] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 
nsid:0 00:08:42.772 [2024-11-18 14:21:38.761090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.772 #11 NEW cov: 12515 ft: 14341 corp: 5/144b lim: 90 exec/s: 0 rss: 72Mb L: 44/44 MS: 1 CopyPart- 00:08:42.772 [2024-11-18 14:21:38.801119] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.772 [2024-11-18 14:21:38.801145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.772 [2024-11-18 14:21:38.801199] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:42.772 [2024-11-18 14:21:38.801215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.772 #12 NEW cov: 12515 ft: 14441 corp: 6/188b lim: 90 exec/s: 0 rss: 72Mb L: 44/44 MS: 1 ChangeByte- 00:08:42.772 [2024-11-18 14:21:38.861248] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.772 [2024-11-18 14:21:38.861276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.772 [2024-11-18 14:21:38.861333] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:42.772 [2024-11-18 14:21:38.861349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.772 #18 NEW cov: 12515 ft: 14540 corp: 7/232b lim: 90 exec/s: 0 rss: 72Mb L: 44/44 MS: 1 ChangeBinInt- 00:08:43.033 [2024-11-18 14:21:38.901698] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.033 [2024-11-18 14:21:38.901727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.033 [2024-11-18 14:21:38.901776] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.033 [2024-11-18 14:21:38.901792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.033 [2024-11-18 14:21:38.901848] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:43.033 [2024-11-18 14:21:38.901863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.033 [2024-11-18 14:21:38.901919] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:43.033 [2024-11-18 14:21:38.901934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.033 #19 NEW cov: 12515 ft: 15053 corp: 8/314b lim: 90 exec/s: 0 rss: 72Mb L: 82/82 MS: 1 InsertRepeatedBytes- 00:08:43.033 [2024-11-18 14:21:38.941506] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.033 [2024-11-18 14:21:38.941532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.033 [2024-11-18 
14:21:38.941575] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.033 [2024-11-18 14:21:38.941607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.033 #20 NEW cov: 12515 ft: 15089 corp: 9/358b lim: 90 exec/s: 0 rss: 72Mb L: 44/82 MS: 1 ChangeBinInt- 00:08:43.033 [2024-11-18 14:21:39.001682] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.034 [2024-11-18 14:21:39.001707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.034 [2024-11-18 14:21:39.001745] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.034 [2024-11-18 14:21:39.001761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.034 #21 NEW cov: 12515 ft: 15115 corp: 10/402b lim: 90 exec/s: 0 rss: 72Mb L: 44/82 MS: 1 ChangeBinInt- 00:08:43.034 [2024-11-18 14:21:39.042100] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.034 [2024-11-18 14:21:39.042127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.034 [2024-11-18 14:21:39.042197] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.034 [2024-11-18 14:21:39.042214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.034 [2024-11-18 14:21:39.042270] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:43.034 [2024-11-18 14:21:39.042286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.034 [2024-11-18 14:21:39.042340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:43.034 [2024-11-18 14:21:39.042356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.034 #22 NEW cov: 12515 ft: 15190 corp: 11/487b lim: 90 exec/s: 0 rss: 72Mb L: 85/85 MS: 1 InsertRepeatedBytes- 00:08:43.034 [2024-11-18 14:21:39.101941] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.034 [2024-11-18 14:21:39.101968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.034 [2024-11-18 14:21:39.102038] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.034 [2024-11-18 14:21:39.102054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.034 #23 NEW cov: 12515 ft: 15216 corp: 12/531b lim: 90 exec/s: 0 rss: 72Mb L: 44/85 MS: 1 ShuffleBytes- 00:08:43.034 [2024-11-18 14:21:39.142049] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.034 [2024-11-18 14:21:39.142075] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.034 [2024-11-18 14:21:39.142129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.034 [2024-11-18 14:21:39.142146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.294 #24 NEW cov: 12515 ft: 15272 corp: 13/575b lim: 90 exec/s: 0 rss: 72Mb L: 44/85 MS: 1 ChangeByte- 00:08:43.294 [2024-11-18 14:21:39.182193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.294 [2024-11-18 14:21:39.182219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.294 [2024-11-18 14:21:39.182260] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.294 [2024-11-18 14:21:39.182276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.294 NEW_FUNC[1/1]: 0x1c52b68 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:43.294 #25 NEW cov: 12538 ft: 15326 corp: 14/627b lim: 90 exec/s: 0 rss: 72Mb L: 52/85 MS: 1 CopyPart- 00:08:43.294 [2024-11-18 14:21:39.242330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.294 [2024-11-18 14:21:39.242356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.294 [2024-11-18 14:21:39.242394] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.294 [2024-11-18 14:21:39.242410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.294 #26 NEW cov: 12538 ft: 15341 corp: 15/671b lim: 90 exec/s: 0 rss: 73Mb L: 44/85 MS: 1 ShuffleBytes- 00:08:43.294 [2024-11-18 14:21:39.302379] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.294 [2024-11-18 14:21:39.302406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.294 #27 NEW cov: 12538 ft: 15384 corp: 16/700b lim: 90 exec/s: 27 rss: 73Mb L: 29/85 MS: 1 EraseBytes- 00:08:43.294 [2024-11-18 14:21:39.362699] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.294 [2024-11-18 14:21:39.362725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.294 [2024-11-18 14:21:39.362781] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.294 [2024-11-18 14:21:39.362797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.294 #28 NEW cov: 12538 ft: 15401 corp: 17/752b lim: 90 exec/s: 28 rss: 73Mb L: 52/85 MS: 1 CrossOver- 00:08:43.554 [2024-11-18 14:21:39.422753] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.554 [2024-11-18 14:21:39.422782] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.554 #29 NEW cov: 12538 ft: 15450 corp: 18/781b lim: 90 exec/s: 29 rss: 73Mb L: 29/85 MS: 1 ShuffleBytes- 00:08:43.554 [2024-11-18 14:21:39.483349] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.554 [2024-11-18 14:21:39.483377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.554 [2024-11-18 14:21:39.483441] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.554 [2024-11-18 14:21:39.483457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.554 [2024-11-18 14:21:39.483514] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:43.554 [2024-11-18 14:21:39.483530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.554 [2024-11-18 14:21:39.483590] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:43.554 [2024-11-18 14:21:39.483606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.554 #30 NEW cov: 12538 ft: 15465 corp: 19/859b lim: 90 exec/s: 30 rss: 73Mb L: 78/85 MS: 1 CopyPart- 00:08:43.554 [2024-11-18 14:21:39.523181] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.554 [2024-11-18 14:21:39.523207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.554 [2024-11-18 14:21:39.523260] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.554 [2024-11-18 14:21:39.523277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.554 #31 NEW cov: 12538 ft: 15473 corp: 20/904b lim: 90 exec/s: 31 rss: 73Mb L: 45/85 MS: 1 InsertByte- 00:08:43.554 [2024-11-18 14:21:39.563239] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.554 [2024-11-18 14:21:39.563265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.554 [2024-11-18 14:21:39.563328] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.554 [2024-11-18 14:21:39.563346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.554 #32 NEW cov: 12538 ft: 15480 corp: 21/948b lim: 90 exec/s: 32 rss: 73Mb L: 44/85 MS: 1 ChangeByte- 00:08:43.554 [2024-11-18 14:21:39.603661] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.554 [2024-11-18 14:21:39.603687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.554 [2024-11-18 14:21:39.603734] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.554 [2024-11-18 14:21:39.603749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.554 [2024-11-18 14:21:39.603805] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:43.554 [2024-11-18 14:21:39.603819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.554 [2024-11-18 14:21:39.603874] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:43.554 [2024-11-18 14:21:39.603888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.554 #33 NEW cov: 12538 ft: 15487 corp: 22/1033b lim: 90 exec/s: 33 rss: 73Mb L: 85/85 MS: 1 ChangeBit- 00:08:43.554 [2024-11-18 14:21:39.663553] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.554 [2024-11-18 14:21:39.663581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.554 [2024-11-18 14:21:39.663620] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.554 [2024-11-18 14:21:39.663636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.815 #34 NEW cov: 12538 ft: 15516 corp: 23/1077b lim: 90 exec/s: 34 rss: 73Mb L: 44/85 MS: 1 ChangeByte- 00:08:43.815 [2024-11-18 14:21:39.703657] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.815 [2024-11-18 14:21:39.703683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.815 [2024-11-18 14:21:39.703722] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.815 [2024-11-18 14:21:39.703738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.815 #35 NEW cov: 12538 ft: 15563 corp: 24/1121b lim: 90 exec/s: 35 rss: 73Mb L: 44/85 MS: 1 ChangeBinInt- 00:08:43.815 [2024-11-18 14:21:39.743780] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.815 [2024-11-18 14:21:39.743807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.815 [2024-11-18 14:21:39.743863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.815 [2024-11-18 14:21:39.743880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.815 #36 NEW cov: 12538 ft: 15625 corp: 25/1166b lim: 90 exec/s: 36 rss: 73Mb L: 45/85 MS: 1 ChangeBinInt- 00:08:43.815 [2024-11-18 14:21:39.804263] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.815 [2024-11-18 14:21:39.804289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:08:43.815 [2024-11-18 14:21:39.804336] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.815 [2024-11-18 14:21:39.804351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.815 [2024-11-18 14:21:39.804406] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:43.815 [2024-11-18 14:21:39.804421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.815 [2024-11-18 14:21:39.804477] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:43.815 [2024-11-18 14:21:39.804493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.815 #37 NEW cov: 12538 ft: 15631 corp: 26/1249b lim: 90 exec/s: 37 rss: 73Mb L: 83/85 MS: 1 InsertByte- 00:08:43.815 [2024-11-18 14:21:39.863977] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.815 [2024-11-18 14:21:39.864004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.815 #38 NEW cov: 12538 ft: 15652 corp: 27/1278b lim: 90 exec/s: 38 rss: 73Mb L: 29/85 MS: 1 CopyPart- 00:08:43.815 [2024-11-18 14:21:39.904217] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.815 [2024-11-18 14:21:39.904242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.815 [2024-11-18 14:21:39.904296] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.815 [2024-11-18 14:21:39.904313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.815 #44 NEW cov: 12538 ft: 15661 corp: 28/1322b lim: 90 exec/s: 44 rss: 73Mb L: 44/85 MS: 1 ChangeByte- 00:08:44.075 [2024-11-18 14:21:39.944334] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.076 [2024-11-18 14:21:39.944363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.076 [2024-11-18 14:21:39.944424] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.076 [2024-11-18 14:21:39.944441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.076 #45 NEW cov: 12538 ft: 15671 corp: 29/1366b lim: 90 exec/s: 45 rss: 73Mb L: 44/85 MS: 1 ChangeBit- 00:08:44.076 [2024-11-18 14:21:39.984469] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.076 [2024-11-18 14:21:39.984496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.076 [2024-11-18 14:21:39.984537] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.076 [2024-11-18 14:21:39.984559] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.076 #46 NEW cov: 12538 ft: 15681 corp: 30/1410b lim: 90 exec/s: 46 rss: 73Mb L: 44/85 MS: 1 ChangeByte- 00:08:44.076 [2024-11-18 14:21:40.024435] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.076 [2024-11-18 14:21:40.024469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.076 #47 NEW cov: 12538 ft: 15729 corp: 31/1444b lim: 90 exec/s: 47 rss: 73Mb L: 34/85 MS: 1 InsertByte- 00:08:44.076 [2024-11-18 14:21:40.065082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.076 [2024-11-18 14:21:40.065110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.076 [2024-11-18 14:21:40.065157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.076 [2024-11-18 14:21:40.065173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.076 [2024-11-18 14:21:40.065232] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.076 [2024-11-18 14:21:40.065250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.076 [2024-11-18 14:21:40.065307] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:44.076 [2024-11-18 14:21:40.065324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.076 #48 NEW cov: 12538 ft: 15736 corp: 32/1527b lim: 90 exec/s: 48 rss: 73Mb L: 83/85 MS: 1 CrossOver- 00:08:44.076 [2024-11-18 14:21:40.124907] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.076 [2024-11-18 14:21:40.124935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.076 [2024-11-18 14:21:40.125004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.076 [2024-11-18 14:21:40.125020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.076 #51 NEW cov: 12538 ft: 15752 corp: 33/1574b lim: 90 exec/s: 51 rss: 73Mb L: 47/85 MS: 3 CopyPart-ChangeByte-InsertRepeatedBytes- 00:08:44.076 [2024-11-18 14:21:40.165033] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.076 [2024-11-18 14:21:40.165058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.076 [2024-11-18 14:21:40.165112] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.076 [2024-11-18 14:21:40.165130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.336 #52 NEW cov: 12538 ft: 15778 corp: 
34/1615b lim: 90 exec/s: 52 rss: 74Mb L: 41/85 MS: 1 EraseBytes- 00:08:44.336 [2024-11-18 14:21:40.225366] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.336 [2024-11-18 14:21:40.225393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.336 [2024-11-18 14:21:40.225460] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.336 [2024-11-18 14:21:40.225477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.336 [2024-11-18 14:21:40.225536] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.336 [2024-11-18 14:21:40.225554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.336 #53 NEW cov: 12538 ft: 16063 corp: 35/1680b lim: 90 exec/s: 53 rss: 74Mb L: 65/85 MS: 1 CrossOver- 00:08:44.336 [2024-11-18 14:21:40.265336] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.336 [2024-11-18 14:21:40.265362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.336 [2024-11-18 14:21:40.265424] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.336 [2024-11-18 14:21:40.265439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.336 #54 NEW cov: 12538 ft: 16067 corp: 36/1724b lim: 90 exec/s: 54 rss: 74Mb L: 44/85 MS: 1 ShuffleBytes- 00:08:44.336 [2024-11-18 14:21:40.305748] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.336 [2024-11-18 14:21:40.305776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.336 [2024-11-18 14:21:40.305829] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.336 [2024-11-18 14:21:40.305845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.336 [2024-11-18 14:21:40.305901] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.337 [2024-11-18 14:21:40.305918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.337 [2024-11-18 14:21:40.305975] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:44.337 [2024-11-18 14:21:40.305992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.337 #55 NEW cov: 12538 ft: 16079 corp: 37/1810b lim: 90 exec/s: 27 rss: 74Mb L: 86/86 MS: 1 CrossOver- 00:08:44.337 #55 DONE cov: 12538 ft: 16079 corp: 37/1810b lim: 90 exec/s: 27 rss: 74Mb 00:08:44.337 Done 55 runs in 2 second(s) 00:08:44.337 14:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_20.conf /var/tmp/suppress_nvmf_fuzz 
00:08:44.337 14:21:40 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:44.337 14:21:40 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:44.337 14:21:40 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:44.337 14:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:44.337 14:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:44.337 14:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:44.337 14:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:44.337 14:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:44.337 14:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:44.337 14:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:44.337 14:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 21 00:08:44.337 14:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4421 00:08:44.337 14:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:44.337 14:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:44.337 14:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:44.337 14:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:44.337 14:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:44.337 14:21:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 00:08:44.597 [2024-11-18 14:21:40.467649] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:08:44.597 [2024-11-18 14:21:40.467713] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid331837 ] 00:08:44.597 [2024-11-18 14:21:40.666969] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:44.597 [2024-11-18 14:21:40.678912] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:44.856 [2024-11-18 14:21:40.731259] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:44.856 [2024-11-18 14:21:40.747591] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:44.856 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:44.856 INFO: Seed: 463084359 00:08:44.856 INFO: Loaded 1 modules (387495 inline 8-bit counters): 387495 [0x2acbe4c, 0x2b2a7f3), 00:08:44.856 INFO: Loaded 1 PC tables (387495 PCs): 387495 [0x2b2a7f8,0x3114268), 00:08:44.856 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:44.856 INFO: A corpus is not provided, starting from an empty corpus 00:08:44.856 #2 INITED exec/s: 0 rss: 65Mb 00:08:44.856 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:44.856 This may also happen if the target rejected all inputs we tried so far 00:08:44.856 [2024-11-18 14:21:40.824998] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:44.856 [2024-11-18 14:21:40.825041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.856 [2024-11-18 14:21:40.825154] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:44.856 [2024-11-18 14:21:40.825177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.856 [2024-11-18 14:21:40.825301] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:44.856 [2024-11-18 14:21:40.825325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.856 [2024-11-18 14:21:40.825453] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:44.856 [2024-11-18 14:21:40.825480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.856 [2024-11-18 14:21:40.825620] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:44.856 [2024-11-18 14:21:40.825647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:45.117 NEW_FUNC[1/715]: 0x47eba8 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:45.117 NEW_FUNC[2/715]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:45.117 #6 NEW cov: 12236 ft: 12237 corp: 2/51b lim: 50 exec/s: 0 rss: 72Mb L: 50/50 MS: 4 CopyPart-CrossOver-ChangeByte-InsertRepeatedBytes- 00:08:45.117 [2024-11-18 14:21:41.166019] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.117 [2024-11-18 14:21:41.166086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.117 [2024-11-18 14:21:41.166238] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.117 [2024-11-18 14:21:41.166277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.117 [2024-11-18 14:21:41.166435] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:45.117 [2024-11-18 14:21:41.166463] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.117 [2024-11-18 14:21:41.166613] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:45.117 [2024-11-18 14:21:41.166641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.117 [2024-11-18 14:21:41.166788] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:45.117 [2024-11-18 14:21:41.166822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:45.117 NEW_FUNC[1/2]: 0x108e878 in posix_sock_read /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/module/sock/posix/posix.c:1527 00:08:45.117 NEW_FUNC[2/2]: 0x2158fe8 in spdk_pipe_writer_get_buffer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/util/pipe.c:92 00:08:45.117 #7 NEW cov: 12400 ft: 13057 corp: 3/101b lim: 50 exec/s: 0 rss: 72Mb L: 50/50 MS: 1 CrossOver- 00:08:45.377 [2024-11-18 14:21:41.245645] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.378 [2024-11-18 14:21:41.245685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.378 [2024-11-18 14:21:41.245799] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.378 [2024-11-18 14:21:41.245825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.378 [2024-11-18 14:21:41.245955] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:45.378 [2024-11-18 14:21:41.245982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.378 [2024-11-18 14:21:41.246105] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:45.378 [2024-11-18 14:21:41.246129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.378 #8 NEW cov: 12406 ft: 13271 corp: 4/143b lim: 50 exec/s: 0 rss: 72Mb L: 42/50 MS: 1 InsertRepeatedBytes- 00:08:45.378 [2024-11-18 14:21:41.295269] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.378 [2024-11-18 14:21:41.295306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.378 [2024-11-18 14:21:41.295432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.378 [2024-11-18 14:21:41.295458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.378 #11 NEW cov: 12491 ft: 13890 corp: 5/166b lim: 50 exec/s: 0 rss: 72Mb L: 23/50 MS: 3 InsertByte-InsertByte-InsertRepeatedBytes- 00:08:45.378 [2024-11-18 14:21:41.346192] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.378 [2024-11-18 14:21:41.346225] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.378 [2024-11-18 14:21:41.346298] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.378 [2024-11-18 14:21:41.346320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.378 [2024-11-18 14:21:41.346442] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:45.378 [2024-11-18 14:21:41.346466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.378 [2024-11-18 14:21:41.346601] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:45.378 [2024-11-18 14:21:41.346623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.378 [2024-11-18 14:21:41.346751] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:45.378 [2024-11-18 14:21:41.346772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:45.378 #12 NEW cov: 12491 ft: 13969 corp: 6/216b lim: 50 exec/s: 0 rss: 72Mb L: 50/50 MS: 1 ShuffleBytes- 00:08:45.378 [2024-11-18 14:21:41.416509] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.378 [2024-11-18 14:21:41.416543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.378 [2024-11-18 14:21:41.416623] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.378 [2024-11-18 14:21:41.416643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.378 [2024-11-18 14:21:41.416761] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:45.378 [2024-11-18 14:21:41.416783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.378 [2024-11-18 14:21:41.416907] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:45.378 [2024-11-18 14:21:41.416930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.378 [2024-11-18 14:21:41.417058] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:45.378 [2024-11-18 14:21:41.417078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:45.378 #13 NEW cov: 12491 ft: 14039 corp: 7/266b lim: 50 exec/s: 0 rss: 72Mb L: 50/50 MS: 1 ChangeBit- 00:08:45.378 [2024-11-18 14:21:41.466562] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.378 [2024-11-18 14:21:41.466599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:08:45.378 [2024-11-18 14:21:41.466680] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.378 [2024-11-18 14:21:41.466701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.378 [2024-11-18 14:21:41.466836] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:45.378 [2024-11-18 14:21:41.466862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.378 [2024-11-18 14:21:41.466982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:45.378 [2024-11-18 14:21:41.467003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.378 [2024-11-18 14:21:41.467147] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:45.378 [2024-11-18 14:21:41.467171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:45.639 #14 NEW cov: 12491 ft: 14091 corp: 8/316b lim: 50 exec/s: 0 rss: 72Mb L: 50/50 MS: 1 CopyPart- 00:08:45.639 [2024-11-18 14:21:41.536760] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.639 [2024-11-18 14:21:41.536792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.639 [2024-11-18 14:21:41.536882] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.639 [2024-11-18 14:21:41.536905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.639 [2024-11-18 14:21:41.537023] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:45.639 [2024-11-18 14:21:41.537047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.639 [2024-11-18 14:21:41.537167] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:45.639 [2024-11-18 14:21:41.537188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.639 [2024-11-18 14:21:41.537312] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:45.639 [2024-11-18 14:21:41.537333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:45.639 #15 NEW cov: 12491 ft: 14114 corp: 9/366b lim: 50 exec/s: 0 rss: 72Mb L: 50/50 MS: 1 ChangeBit- 00:08:45.639 [2024-11-18 14:21:41.606994] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.639 [2024-11-18 14:21:41.607025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.639 [2024-11-18 14:21:41.607140] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) 
sqid:1 cid:1 nsid:0 00:08:45.639 [2024-11-18 14:21:41.607163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.639 [2024-11-18 14:21:41.607280] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:45.639 [2024-11-18 14:21:41.607305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.639 [2024-11-18 14:21:41.607427] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:45.639 [2024-11-18 14:21:41.607447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.639 [2024-11-18 14:21:41.607579] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:45.639 [2024-11-18 14:21:41.607600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:45.639 #16 NEW cov: 12491 ft: 14123 corp: 10/416b lim: 50 exec/s: 0 rss: 73Mb L: 50/50 MS: 1 ChangeBinInt- 00:08:45.639 [2024-11-18 14:21:41.677196] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.639 [2024-11-18 14:21:41.677230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.639 [2024-11-18 14:21:41.677317] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.639 [2024-11-18 14:21:41.677337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.639 [2024-11-18 14:21:41.677476] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:45.639 [2024-11-18 14:21:41.677496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.639 [2024-11-18 14:21:41.677637] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:45.640 [2024-11-18 14:21:41.677658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.640 [2024-11-18 14:21:41.677786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:45.640 [2024-11-18 14:21:41.677808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:45.640 NEW_FUNC[1/1]: 0x1c52b68 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:45.640 #17 NEW cov: 12514 ft: 14171 corp: 11/466b lim: 50 exec/s: 0 rss: 73Mb L: 50/50 MS: 1 ShuffleBytes- 00:08:45.640 [2024-11-18 14:21:41.727312] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.640 [2024-11-18 14:21:41.727347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.640 [2024-11-18 14:21:41.727456] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.640 [2024-11-18 14:21:41.727480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.640 [2024-11-18 14:21:41.727606] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:45.640 [2024-11-18 14:21:41.727628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.640 [2024-11-18 14:21:41.727744] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:45.640 [2024-11-18 14:21:41.727767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.640 [2024-11-18 14:21:41.727896] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:45.640 [2024-11-18 14:21:41.727922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:45.901 #18 NEW cov: 12514 ft: 14183 corp: 12/516b lim: 50 exec/s: 0 rss: 73Mb L: 50/50 MS: 1 ShuffleBytes- 00:08:45.901 [2024-11-18 14:21:41.797539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.901 [2024-11-18 14:21:41.797576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.901 [2024-11-18 14:21:41.797671] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.901 [2024-11-18 14:21:41.797697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.901 [2024-11-18 14:21:41.797814] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:45.901 [2024-11-18 14:21:41.797838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.901 [2024-11-18 14:21:41.797958] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:45.901 [2024-11-18 14:21:41.797983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.901 [2024-11-18 14:21:41.798104] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:45.901 [2024-11-18 14:21:41.798124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:45.901 #19 NEW cov: 12514 ft: 14214 corp: 13/566b lim: 50 exec/s: 19 rss: 73Mb L: 50/50 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\006"- 00:08:45.901 [2024-11-18 14:21:41.867838] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.901 [2024-11-18 14:21:41.867873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.901 [2024-11-18 14:21:41.867966] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.901 [2024-11-18 14:21:41.867997] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.901 [2024-11-18 14:21:41.868117] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:45.901 [2024-11-18 14:21:41.868135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.901 [2024-11-18 14:21:41.868269] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:45.901 [2024-11-18 14:21:41.868291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.901 [2024-11-18 14:21:41.868414] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:45.901 [2024-11-18 14:21:41.868438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:45.901 #20 NEW cov: 12514 ft: 14233 corp: 14/616b lim: 50 exec/s: 20 rss: 73Mb L: 50/50 MS: 1 ChangeByte- 00:08:45.901 [2024-11-18 14:21:41.917931] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.901 [2024-11-18 14:21:41.917960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.901 [2024-11-18 14:21:41.918036] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.901 [2024-11-18 14:21:41.918060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.901 [2024-11-18 14:21:41.918190] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:45.901 [2024-11-18 14:21:41.918213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.901 [2024-11-18 14:21:41.918334] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:45.901 [2024-11-18 14:21:41.918361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.901 [2024-11-18 14:21:41.918486] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:45.901 [2024-11-18 14:21:41.918509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:45.901 #21 NEW cov: 12514 ft: 14285 corp: 15/666b lim: 50 exec/s: 21 rss: 73Mb L: 50/50 MS: 1 ChangeBit- 00:08:45.901 [2024-11-18 14:21:41.988227] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.901 [2024-11-18 14:21:41.988256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.901 [2024-11-18 14:21:41.988342] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.901 [2024-11-18 14:21:41.988360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 
p:0 m:0 dnr:1 00:08:45.901 [2024-11-18 14:21:41.988486] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:45.901 [2024-11-18 14:21:41.988514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.901 [2024-11-18 14:21:41.988637] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:45.901 [2024-11-18 14:21:41.988661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.901 [2024-11-18 14:21:41.988781] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:45.901 [2024-11-18 14:21:41.988803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:46.162 #22 NEW cov: 12514 ft: 14303 corp: 16/716b lim: 50 exec/s: 22 rss: 73Mb L: 50/50 MS: 1 ChangeByte- 00:08:46.162 [2024-11-18 14:21:42.058383] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.162 [2024-11-18 14:21:42.058418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.162 [2024-11-18 14:21:42.058539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.162 [2024-11-18 14:21:42.058563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.162 [2024-11-18 14:21:42.058692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.162 [2024-11-18 14:21:42.058714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.162 [2024-11-18 14:21:42.058846] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:46.162 [2024-11-18 14:21:42.058864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.162 [2024-11-18 14:21:42.058995] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:46.162 [2024-11-18 14:21:42.059019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:46.162 #23 NEW cov: 12514 ft: 14363 corp: 17/766b lim: 50 exec/s: 23 rss: 73Mb L: 50/50 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\006"- 00:08:46.162 [2024-11-18 14:21:42.108520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.162 [2024-11-18 14:21:42.108553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.162 [2024-11-18 14:21:42.108622] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.162 [2024-11-18 14:21:42.108643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.162 [2024-11-18 14:21:42.108767] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.162 [2024-11-18 14:21:42.108787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.162 [2024-11-18 14:21:42.108905] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:46.162 [2024-11-18 14:21:42.108926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.162 [2024-11-18 14:21:42.109049] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:46.162 [2024-11-18 14:21:42.109070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:46.162 #24 NEW cov: 12514 ft: 14384 corp: 18/816b lim: 50 exec/s: 24 rss: 73Mb L: 50/50 MS: 1 CopyPart- 00:08:46.162 [2024-11-18 14:21:42.157944] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.162 [2024-11-18 14:21:42.157977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.162 [2024-11-18 14:21:42.158096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.162 [2024-11-18 14:21:42.158119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.162 #25 NEW cov: 12514 ft: 14434 corp: 19/845b lim: 50 exec/s: 25 rss: 73Mb L: 29/50 MS: 1 EraseBytes- 00:08:46.162 [2024-11-18 14:21:42.229035] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.162 [2024-11-18 14:21:42.229065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.162 [2024-11-18 14:21:42.229145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.162 [2024-11-18 14:21:42.229163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.162 [2024-11-18 14:21:42.229279] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.162 [2024-11-18 14:21:42.229298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.162 [2024-11-18 14:21:42.229414] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:46.162 [2024-11-18 14:21:42.229433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.162 [2024-11-18 14:21:42.229552] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:46.162 [2024-11-18 14:21:42.229574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:46.162 #26 NEW cov: 12514 ft: 14509 corp: 20/895b lim: 50 exec/s: 26 rss: 73Mb L: 50/50 MS: 1 CopyPart- 00:08:46.162 [2024-11-18 14:21:42.278324] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.162 [2024-11-18 14:21:42.278355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.162 [2024-11-18 14:21:42.278461] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.162 [2024-11-18 14:21:42.278485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.423 #27 NEW cov: 12514 ft: 14531 corp: 21/921b lim: 50 exec/s: 27 rss: 73Mb L: 26/50 MS: 1 EraseBytes- 00:08:46.423 [2024-11-18 14:21:42.329327] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.423 [2024-11-18 14:21:42.329361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.423 [2024-11-18 14:21:42.329446] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.423 [2024-11-18 14:21:42.329463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.423 [2024-11-18 14:21:42.329588] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.423 [2024-11-18 14:21:42.329613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.424 [2024-11-18 14:21:42.329740] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:46.424 [2024-11-18 14:21:42.329760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.424 [2024-11-18 14:21:42.329887] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:46.424 [2024-11-18 14:21:42.329906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:46.424 #28 NEW cov: 12514 ft: 14567 corp: 22/971b lim: 50 exec/s: 28 rss: 73Mb L: 50/50 MS: 1 ChangeASCIIInt- 00:08:46.424 [2024-11-18 14:21:42.379231] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.424 [2024-11-18 14:21:42.379264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.424 [2024-11-18 14:21:42.379402] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.424 [2024-11-18 14:21:42.379426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.424 [2024-11-18 14:21:42.379553] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.424 [2024-11-18 14:21:42.379576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.424 [2024-11-18 14:21:42.379696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:46.424 [2024-11-18 
14:21:42.379718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.424 #29 NEW cov: 12514 ft: 14606 corp: 23/1018b lim: 50 exec/s: 29 rss: 73Mb L: 47/50 MS: 1 EraseBytes- 00:08:46.424 [2024-11-18 14:21:42.449618] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.424 [2024-11-18 14:21:42.449648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.424 [2024-11-18 14:21:42.449756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.424 [2024-11-18 14:21:42.449780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.424 [2024-11-18 14:21:42.449898] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.424 [2024-11-18 14:21:42.449922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.424 [2024-11-18 14:21:42.450039] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:46.424 [2024-11-18 14:21:42.450060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.424 [2024-11-18 14:21:42.450181] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:46.424 [2024-11-18 14:21:42.450204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:46.424 #30 NEW cov: 12514 ft: 14616 corp: 24/1068b lim: 50 exec/s: 30 rss: 73Mb L: 50/50 MS: 1 ChangeBit- 00:08:46.424 [2024-11-18 14:21:42.519856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.424 [2024-11-18 14:21:42.519889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.424 [2024-11-18 14:21:42.519997] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.424 [2024-11-18 14:21:42.520019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.424 [2024-11-18 14:21:42.520145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.424 [2024-11-18 14:21:42.520167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.424 [2024-11-18 14:21:42.520291] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:46.424 [2024-11-18 14:21:42.520313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.424 [2024-11-18 14:21:42.520435] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:46.424 [2024-11-18 14:21:42.520460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:46.685 #31 NEW cov: 12514 ft: 14621 corp: 25/1118b lim: 50 exec/s: 31 rss: 73Mb L: 50/50 MS: 1 CrossOver- 00:08:46.685 [2024-11-18 14:21:42.589794] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.685 [2024-11-18 14:21:42.589828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.685 [2024-11-18 14:21:42.589946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.685 [2024-11-18 14:21:42.589970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.685 [2024-11-18 14:21:42.590089] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.685 [2024-11-18 14:21:42.590110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.685 [2024-11-18 14:21:42.590241] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:46.685 [2024-11-18 14:21:42.590263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.685 #32 NEW cov: 12514 ft: 14652 corp: 26/1165b lim: 50 exec/s: 32 rss: 74Mb L: 47/50 MS: 1 InsertRepeatedBytes- 00:08:46.685 [2024-11-18 14:21:42.660350] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.685 [2024-11-18 14:21:42.660386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.685 [2024-11-18 14:21:42.660504] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.685 [2024-11-18 14:21:42.660530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.685 [2024-11-18 14:21:42.660657] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.685 [2024-11-18 14:21:42.660681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.685 [2024-11-18 14:21:42.660804] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:46.685 [2024-11-18 14:21:42.660829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.685 [2024-11-18 14:21:42.660955] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:46.685 [2024-11-18 14:21:42.660979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:46.685 #33 NEW cov: 12514 ft: 14681 corp: 27/1215b lim: 50 exec/s: 33 rss: 74Mb L: 50/50 MS: 1 ShuffleBytes- 00:08:46.685 [2024-11-18 14:21:42.730431] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.685 [2024-11-18 14:21:42.730462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.685 [2024-11-18 14:21:42.730547] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.685 [2024-11-18 14:21:42.730579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.685 [2024-11-18 14:21:42.730701] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.685 [2024-11-18 14:21:42.730727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.685 [2024-11-18 14:21:42.730856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:46.685 [2024-11-18 14:21:42.730880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.685 [2024-11-18 14:21:42.731004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:46.685 [2024-11-18 14:21:42.731027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:46.685 #34 NEW cov: 12514 ft: 14692 corp: 28/1265b lim: 50 exec/s: 34 rss: 74Mb L: 50/50 MS: 1 ChangeByte- 00:08:46.685 [2024-11-18 14:21:42.780559] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.685 [2024-11-18 14:21:42.780593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.685 [2024-11-18 14:21:42.780676] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.685 [2024-11-18 14:21:42.780698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.685 [2024-11-18 14:21:42.780822] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.685 [2024-11-18 14:21:42.780846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.685 [2024-11-18 14:21:42.780969] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:46.685 [2024-11-18 14:21:42.780993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.685 [2024-11-18 14:21:42.781121] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:46.685 [2024-11-18 14:21:42.781145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:46.685 #35 NEW cov: 12514 ft: 14709 corp: 29/1315b lim: 50 exec/s: 17 rss: 74Mb L: 50/50 MS: 1 CMP- DE: "\000\213\213\300\030~Q\332"- 00:08:46.685 #35 DONE cov: 12514 ft: 14709 corp: 29/1315b lim: 50 exec/s: 17 rss: 74Mb 00:08:46.685 ###### Recommended dictionary. ###### 00:08:46.685 "\000\000\000\000\000\000\000\006" # Uses: 1 00:08:46.685 "\000\213\213\300\030~Q\332" # Uses: 0 00:08:46.685 ###### End of recommended dictionary. 
###### 00:08:46.685 Done 35 runs in 2 second(s) 00:08:46.946 14:21:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_21.conf /var/tmp/suppress_nvmf_fuzz 00:08:46.946 14:21:42 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:46.946 14:21:42 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:46.946 14:21:42 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:08:46.946 14:21:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:08:46.946 14:21:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:46.946 14:21:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:46.946 14:21:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:46.946 14:21:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:08:46.946 14:21:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:46.946 14:21:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:46.946 14:21:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 22 00:08:46.946 14:21:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4422 00:08:46.946 14:21:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:46.946 14:21:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:08:46.946 14:21:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:46.946 14:21:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:46.947 14:21:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:46.947 14:21:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 00:08:46.947 [2024-11-18 14:21:42.956701] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 
00:08:46.947 [2024-11-18 14:21:42.956766] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid332323 ] 00:08:47.207 [2024-11-18 14:21:43.242945] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:47.208 [2024-11-18 14:21:43.265151] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:47.208 [2024-11-18 14:21:43.317459] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:47.208 [2024-11-18 14:21:43.333812] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:08:47.469 INFO: Running with entropic power schedule (0xFF, 100). 00:08:47.469 INFO: Seed: 3049078562 00:08:47.469 INFO: Loaded 1 modules (387495 inline 8-bit counters): 387495 [0x2acbe4c, 0x2b2a7f3), 00:08:47.469 INFO: Loaded 1 PC tables (387495 PCs): 387495 [0x2b2a7f8,0x3114268), 00:08:47.469 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:47.469 INFO: A corpus is not provided, starting from an empty corpus 00:08:47.469 #2 INITED exec/s: 0 rss: 65Mb 00:08:47.469 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:47.469 This may also happen if the target rejected all inputs we tried so far 00:08:47.469 [2024-11-18 14:21:43.393273] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:47.469 [2024-11-18 14:21:43.393304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.469 [2024-11-18 14:21:43.393366] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:47.469 [2024-11-18 14:21:43.393386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.469 [2024-11-18 14:21:43.393450] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:47.469 [2024-11-18 14:21:43.393470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.729 NEW_FUNC[1/717]: 0x480e78 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:47.729 NEW_FUNC[2/717]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:47.729 #5 NEW cov: 12313 ft: 12306 corp: 2/60b lim: 85 exec/s: 0 rss: 73Mb L: 59/59 MS: 3 ChangeBit-ChangeBinInt-InsertRepeatedBytes- 00:08:47.729 [2024-11-18 14:21:43.724162] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:47.729 [2024-11-18 14:21:43.724250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.729 #7 NEW cov: 12426 ft: 13944 corp: 3/77b lim: 85 exec/s: 0 rss: 73Mb L: 17/59 MS: 2 ChangeBit-CrossOver- 00:08:47.729 [2024-11-18 14:21:43.783923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:47.729 [2024-11-18 14:21:43.783951] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.729 #8 NEW cov: 12432 ft: 14133 corp: 4/94b lim: 85 exec/s: 0 rss: 73Mb L: 17/59 MS: 1 CMP- DE: "\377\001\000\000"- 00:08:47.729 [2024-11-18 14:21:43.844045] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:47.729 [2024-11-18 14:21:43.844073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.990 #9 NEW cov: 12517 ft: 14399 corp: 5/111b lim: 85 exec/s: 0 rss: 73Mb L: 17/59 MS: 1 CopyPart- 00:08:47.990 [2024-11-18 14:21:43.904252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:47.990 [2024-11-18 14:21:43.904281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.990 #11 NEW cov: 12517 ft: 14596 corp: 6/129b lim: 85 exec/s: 0 rss: 73Mb L: 18/59 MS: 2 ChangeBit-CrossOver- 00:08:47.990 [2024-11-18 14:21:43.944616] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:47.990 [2024-11-18 14:21:43.944643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.990 [2024-11-18 14:21:43.944701] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:47.990 [2024-11-18 14:21:43.944722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.990 [2024-11-18 14:21:43.944786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:47.990 [2024-11-18 14:21:43.944808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.990 #12 NEW cov: 12517 ft: 14686 corp: 7/184b lim: 85 exec/s: 0 rss: 73Mb L: 55/59 MS: 1 InsertRepeatedBytes- 00:08:47.990 [2024-11-18 14:21:43.984744] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:47.990 [2024-11-18 14:21:43.984771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.990 [2024-11-18 14:21:43.984837] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:47.990 [2024-11-18 14:21:43.984858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.990 [2024-11-18 14:21:43.984923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:47.990 [2024-11-18 14:21:43.984945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.990 #13 NEW cov: 12517 ft: 14727 corp: 8/243b lim: 85 exec/s: 0 rss: 73Mb L: 59/59 MS: 1 ChangeByte- 00:08:47.990 [2024-11-18 14:21:44.044629] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:47.990 [2024-11-18 14:21:44.044658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.990 #14 NEW cov: 12517 ft: 14798 corp: 9/260b lim: 85 exec/s: 0 rss: 73Mb L: 17/59 MS: 1 ChangeBit- 00:08:47.990 [2024-11-18 14:21:44.084999] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:47.990 [2024-11-18 14:21:44.085027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.990 [2024-11-18 14:21:44.085088] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:47.990 [2024-11-18 14:21:44.085109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.990 [2024-11-18 14:21:44.085177] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:47.990 [2024-11-18 14:21:44.085199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.251 #15 NEW cov: 12517 ft: 14829 corp: 10/325b lim: 85 exec/s: 0 rss: 73Mb L: 65/65 MS: 1 InsertRepeatedBytes- 00:08:48.251 [2024-11-18 14:21:44.145190] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.251 [2024-11-18 14:21:44.145219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.251 [2024-11-18 14:21:44.145282] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:48.251 [2024-11-18 14:21:44.145303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.251 [2024-11-18 14:21:44.145370] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:48.251 [2024-11-18 14:21:44.145388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.251 #16 NEW cov: 12517 ft: 14890 corp: 11/384b lim: 85 exec/s: 0 rss: 73Mb L: 59/65 MS: 1 CopyPart- 00:08:48.251 [2024-11-18 14:21:44.184994] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.251 [2024-11-18 14:21:44.185022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.251 #17 NEW cov: 12517 ft: 14927 corp: 12/401b lim: 85 exec/s: 0 rss: 73Mb L: 17/65 MS: 1 PersAutoDict- DE: "\377\001\000\000"- 00:08:48.251 [2024-11-18 14:21:44.245767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.251 [2024-11-18 14:21:44.245795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.251 [2024-11-18 14:21:44.245855] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:48.251 [2024-11-18 14:21:44.245876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.251 [2024-11-18 14:21:44.245940] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) 
sqid:1 cid:2 nsid:0 00:08:48.251 [2024-11-18 14:21:44.245962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.251 [2024-11-18 14:21:44.246040] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:48.251 [2024-11-18 14:21:44.246060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.251 [2024-11-18 14:21:44.246126] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:48.251 [2024-11-18 14:21:44.246145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:48.251 NEW_FUNC[1/1]: 0x1c52b68 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:48.251 #18 NEW cov: 12540 ft: 15375 corp: 13/486b lim: 85 exec/s: 0 rss: 74Mb L: 85/85 MS: 1 InsertRepeatedBytes- 00:08:48.251 [2024-11-18 14:21:44.305920] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.251 [2024-11-18 14:21:44.305949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.251 [2024-11-18 14:21:44.306008] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:48.251 [2024-11-18 14:21:44.306028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.251 [2024-11-18 14:21:44.306093] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:48.251 [2024-11-18 14:21:44.306113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.251 [2024-11-18 14:21:44.306179] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:48.251 [2024-11-18 14:21:44.306198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.251 [2024-11-18 14:21:44.306262] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:48.251 [2024-11-18 14:21:44.306281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:48.251 #19 NEW cov: 12540 ft: 15416 corp: 14/571b lim: 85 exec/s: 0 rss: 74Mb L: 85/85 MS: 1 ChangeBit- 00:08:48.251 [2024-11-18 14:21:44.365511] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.252 [2024-11-18 14:21:44.365540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.512 #20 NEW cov: 12540 ft: 15481 corp: 15/588b lim: 85 exec/s: 20 rss: 74Mb L: 17/85 MS: 1 CMP- DE: "\001\000\377\377"- 00:08:48.512 [2024-11-18 14:21:44.406168] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.512 [2024-11-18 14:21:44.406196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 
cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.512 [2024-11-18 14:21:44.406257] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:48.512 [2024-11-18 14:21:44.406278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.512 [2024-11-18 14:21:44.406341] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:48.512 [2024-11-18 14:21:44.406359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.512 [2024-11-18 14:21:44.406421] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:48.512 [2024-11-18 14:21:44.406440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.512 [2024-11-18 14:21:44.406505] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:48.512 [2024-11-18 14:21:44.406524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:48.512 #21 NEW cov: 12540 ft: 15502 corp: 16/673b lim: 85 exec/s: 21 rss: 74Mb L: 85/85 MS: 1 PersAutoDict- DE: "\001\000\377\377"- 00:08:48.512 [2024-11-18 14:21:44.445992] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.512 [2024-11-18 14:21:44.446019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.512 [2024-11-18 14:21:44.446080] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:48.512 [2024-11-18 14:21:44.446100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.512 [2024-11-18 14:21:44.446167] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:48.512 [2024-11-18 14:21:44.446186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.512 #22 NEW cov: 12540 ft: 15540 corp: 17/733b lim: 85 exec/s: 22 rss: 74Mb L: 60/85 MS: 1 CopyPart- 00:08:48.512 [2024-11-18 14:21:44.485818] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.512 [2024-11-18 14:21:44.485846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.512 #23 NEW cov: 12540 ft: 15582 corp: 18/750b lim: 85 exec/s: 23 rss: 74Mb L: 17/85 MS: 1 ChangeByte- 00:08:48.512 [2024-11-18 14:21:44.526103] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.512 [2024-11-18 14:21:44.526131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.512 [2024-11-18 14:21:44.526200] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:48.512 [2024-11-18 14:21:44.526223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.512 #24 NEW cov: 12540 ft: 15860 corp: 19/784b lim: 85 exec/s: 24 rss: 74Mb L: 34/85 MS: 1 EraseBytes- 00:08:48.512 [2024-11-18 14:21:44.566062] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.512 [2024-11-18 14:21:44.566090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.512 #25 NEW cov: 12540 ft: 15877 corp: 20/813b lim: 85 exec/s: 25 rss: 74Mb L: 29/85 MS: 1 CrossOver- 00:08:48.512 [2024-11-18 14:21:44.626255] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.512 [2024-11-18 14:21:44.626285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.772 #26 NEW cov: 12540 ft: 15887 corp: 21/831b lim: 85 exec/s: 26 rss: 74Mb L: 18/85 MS: 1 CopyPart- 00:08:48.772 [2024-11-18 14:21:44.666651] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.772 [2024-11-18 14:21:44.666679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.772 [2024-11-18 14:21:44.666740] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:48.772 [2024-11-18 14:21:44.666760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.772 [2024-11-18 14:21:44.666826] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:48.772 [2024-11-18 14:21:44.666844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.772 #27 NEW cov: 12540 ft: 15916 corp: 22/890b lim: 85 exec/s: 27 rss: 74Mb L: 59/85 MS: 1 ChangeBinInt- 00:08:48.772 [2024-11-18 14:21:44.706463] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.772 [2024-11-18 14:21:44.706492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.772 #28 NEW cov: 12540 ft: 15946 corp: 23/908b lim: 85 exec/s: 28 rss: 74Mb L: 18/85 MS: 1 InsertByte- 00:08:48.772 [2024-11-18 14:21:44.767094] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.772 [2024-11-18 14:21:44.767120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.772 [2024-11-18 14:21:44.767169] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:48.772 [2024-11-18 14:21:44.767184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.772 [2024-11-18 14:21:44.767236] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:48.772 [2024-11-18 14:21:44.767251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.772 [2024-11-18 
14:21:44.767304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:48.772 [2024-11-18 14:21:44.767318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.772 #29 NEW cov: 12540 ft: 15960 corp: 24/981b lim: 85 exec/s: 29 rss: 74Mb L: 73/85 MS: 1 InsertRepeatedBytes- 00:08:48.772 [2024-11-18 14:21:44.826941] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.772 [2024-11-18 14:21:44.826967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.772 [2024-11-18 14:21:44.827036] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:48.772 [2024-11-18 14:21:44.827051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.772 #30 NEW cov: 12540 ft: 15986 corp: 25/1029b lim: 85 exec/s: 30 rss: 74Mb L: 48/85 MS: 1 CrossOver- 00:08:48.772 [2024-11-18 14:21:44.867043] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.772 [2024-11-18 14:21:44.867068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.772 [2024-11-18 14:21:44.867119] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:48.772 [2024-11-18 14:21:44.867135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.033 #31 NEW cov: 12540 ft: 15996 corp: 26/1064b lim: 85 exec/s: 31 rss: 74Mb L: 35/85 MS: 1 InsertByte- 00:08:49.033 [2024-11-18 14:21:44.927230] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.033 [2024-11-18 14:21:44.927256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.033 [2024-11-18 14:21:44.927308] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.033 [2024-11-18 14:21:44.927325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.033 #32 NEW cov: 12540 ft: 16086 corp: 27/1099b lim: 85 exec/s: 32 rss: 74Mb L: 35/85 MS: 1 ChangeBit- 00:08:49.033 [2024-11-18 14:21:44.987742] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.033 [2024-11-18 14:21:44.987769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.033 [2024-11-18 14:21:44.987820] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.033 [2024-11-18 14:21:44.987835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.033 [2024-11-18 14:21:44.987891] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.033 [2024-11-18 14:21:44.987906] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.033 [2024-11-18 14:21:44.987961] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:49.033 [2024-11-18 14:21:44.987976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.033 #34 NEW cov: 12540 ft: 16090 corp: 28/1173b lim: 85 exec/s: 34 rss: 74Mb L: 74/85 MS: 2 EraseBytes-CrossOver- 00:08:49.033 [2024-11-18 14:21:45.027373] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.033 [2024-11-18 14:21:45.027399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.033 #35 NEW cov: 12540 ft: 16110 corp: 29/1190b lim: 85 exec/s: 35 rss: 74Mb L: 17/85 MS: 1 ChangeByte- 00:08:49.033 [2024-11-18 14:21:45.068060] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.033 [2024-11-18 14:21:45.068086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.033 [2024-11-18 14:21:45.068139] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.033 [2024-11-18 14:21:45.068154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.033 [2024-11-18 14:21:45.068208] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.033 [2024-11-18 14:21:45.068223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.033 [2024-11-18 14:21:45.068276] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:49.033 [2024-11-18 14:21:45.068292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.033 [2024-11-18 14:21:45.068346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:49.033 [2024-11-18 14:21:45.068362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:49.033 #36 NEW cov: 12540 ft: 16163 corp: 30/1275b lim: 85 exec/s: 36 rss: 74Mb L: 85/85 MS: 1 ShuffleBytes- 00:08:49.033 [2024-11-18 14:21:45.127953] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.033 [2024-11-18 14:21:45.127979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.033 [2024-11-18 14:21:45.128026] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.033 [2024-11-18 14:21:45.128042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.033 [2024-11-18 14:21:45.128098] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.033 [2024-11-18 
14:21:45.128114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.033 #37 NEW cov: 12540 ft: 16166 corp: 31/1334b lim: 85 exec/s: 37 rss: 74Mb L: 59/85 MS: 1 ChangeBinInt- 00:08:49.293 [2024-11-18 14:21:45.168077] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.293 [2024-11-18 14:21:45.168118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.293 [2024-11-18 14:21:45.168170] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.293 [2024-11-18 14:21:45.168186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.294 [2024-11-18 14:21:45.168242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.294 [2024-11-18 14:21:45.168258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.294 #38 NEW cov: 12540 ft: 16181 corp: 32/1393b lim: 85 exec/s: 38 rss: 74Mb L: 59/85 MS: 1 CMP- DE: "\373\307\350\361\300\213\213\000"- 00:08:49.294 [2024-11-18 14:21:45.208213] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.294 [2024-11-18 14:21:45.208239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.294 [2024-11-18 14:21:45.208298] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.294 [2024-11-18 14:21:45.208313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.294 [2024-11-18 14:21:45.208374] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.294 [2024-11-18 14:21:45.208392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.294 #39 NEW cov: 12540 ft: 16207 corp: 33/1449b lim: 85 exec/s: 39 rss: 74Mb L: 56/85 MS: 1 InsertByte- 00:08:49.294 [2024-11-18 14:21:45.247963] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.294 [2024-11-18 14:21:45.247988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.294 #40 NEW cov: 12540 ft: 16248 corp: 34/1470b lim: 85 exec/s: 40 rss: 74Mb L: 21/85 MS: 1 CopyPart- 00:08:49.294 [2024-11-18 14:21:45.308801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.294 [2024-11-18 14:21:45.308826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.294 [2024-11-18 14:21:45.308882] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.294 [2024-11-18 14:21:45.308898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.294 
[2024-11-18 14:21:45.308949] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.294 [2024-11-18 14:21:45.308965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.294 [2024-11-18 14:21:45.309015] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:49.294 [2024-11-18 14:21:45.309030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.294 [2024-11-18 14:21:45.309082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:49.294 [2024-11-18 14:21:45.309096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:49.294 #41 NEW cov: 12540 ft: 16255 corp: 35/1555b lim: 85 exec/s: 41 rss: 74Mb L: 85/85 MS: 1 CrossOver- 00:08:49.294 [2024-11-18 14:21:45.348295] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.294 [2024-11-18 14:21:45.348320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.294 #42 NEW cov: 12540 ft: 16272 corp: 36/1587b lim: 85 exec/s: 42 rss: 74Mb L: 32/85 MS: 1 CopyPart- 00:08:49.294 [2024-11-18 14:21:45.388708] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.294 [2024-11-18 14:21:45.388734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.294 [2024-11-18 14:21:45.388784] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.294 [2024-11-18 14:21:45.388800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.294 [2024-11-18 14:21:45.388869] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.294 [2024-11-18 14:21:45.388884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.294 #43 NEW cov: 12540 ft: 16281 corp: 37/1646b lim: 85 exec/s: 21 rss: 74Mb L: 59/85 MS: 1 ChangeBinInt- 00:08:49.294 #43 DONE cov: 12540 ft: 16281 corp: 37/1646b lim: 85 exec/s: 21 rss: 74Mb 00:08:49.294 ###### Recommended dictionary. ###### 00:08:49.294 "\377\001\000\000" # Uses: 1 00:08:49.294 "\001\000\377\377" # Uses: 1 00:08:49.294 "\373\307\350\361\300\213\213\000" # Uses: 0 00:08:49.294 ###### End of recommended dictionary. 
###### 00:08:49.294 Done 43 runs in 2 second(s) 00:08:49.555 14:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_22.conf /var/tmp/suppress_nvmf_fuzz 00:08:49.555 14:21:45 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:49.555 14:21:45 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:49.555 14:21:45 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:08:49.555 14:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:08:49.555 14:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:49.555 14:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:49.555 14:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:49.555 14:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:08:49.555 14:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:49.555 14:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:49.555 14:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 23 00:08:49.555 14:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4423 00:08:49.555 14:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:49.555 14:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:08:49.555 14:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:49.555 14:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:49.555 14:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:49.555 14:21:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 00:08:49.555 [2024-11-18 14:21:45.556024] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 
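The nvmf/run.sh and ../common.sh trace above repeats for target 23 exactly what it did for target 22: derive a per-fuzzer TCP port, create the corpus directory, rewrite trsvcid in the JSON config, register two LSAN leak suppressions, and launch llvm_nvme_fuzz for one second. A rough reconstruction of that loop body, assembled only from the echoed commands, follows; the $rootdir variable, the output redirections, and the exact port expression are assumptions (the trace shows only `printf %02d 22` followed by `port=4422`), so treat this as a sketch rather than the script's actual source:

    #!/usr/bin/env bash
    # Sketch of the per-target setup echoed by nvmf/run.sh@23-54 above.
    # $rootdir (the spdk checkout) and the redirections are assumed.
    start_llvm_fuzz() {
        local fuzzer_type=$1 timen=$2 core=$3
        local corpus_dir=$rootdir/../corpus/llvm_nvmf_$(printf %02d "$fuzzer_type")
        local nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
        local suppress_file=/var/tmp/suppress_nvmf_fuzz
        local LSAN_OPTIONS=report_objects=1:suppressions=$suppress_file:print_suppressions=0
        local port=44$(printf %02d "$fuzzer_type")   # 22 -> 4422, 23 -> 4423
        mkdir -p "$corpus_dir"
        local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

        # Retarget the NVMe-oF JSON config from the default port 4420.
        sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
            "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

        # Suppress two known leaks in the target rather than abort on them.
        echo leak:spdk_nvmf_qpair_disconnect > "$suppress_file"
        echo leak:nvmf_ctrlr_create >> "$suppress_file"

        "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
            -P "$rootdir/../output/llvm/" -F "$trid" -c "$nvmf_cfg" \
            -t "$timen" -D "$corpus_dir" -Z "$fuzzer_type"

        rm -rf "$nvmf_cfg" "$suppress_file"
    }

    # Per the ../common.sh trace, each target runs 1 second on core 0x1:
    # while (( i < fuzz_num )); do start_llvm_fuzz "$i" 1 0x1; (( i++ )); done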
00:08:49.555 [2024-11-18 14:21:45.556099] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid332657 ] 00:08:49.816 [2024-11-18 14:21:45.835258] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:49.816 [2024-11-18 14:21:45.853631] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:49.816 [2024-11-18 14:21:45.906067] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:49.816 [2024-11-18 14:21:45.922394] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:08:49.816 INFO: Running with entropic power schedule (0xFF, 100). 00:08:49.816 INFO: Seed: 1344128329 00:08:50.077 INFO: Loaded 1 modules (387495 inline 8-bit counters): 387495 [0x2acbe4c, 0x2b2a7f3), 00:08:50.077 INFO: Loaded 1 PC tables (387495 PCs): 387495 [0x2b2a7f8,0x3114268), 00:08:50.077 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:50.077 INFO: A corpus is not provided, starting from an empty corpus 00:08:50.077 #2 INITED exec/s: 0 rss: 64Mb 00:08:50.077 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:50.077 This may also happen if the target rejected all inputs we tried so far 00:08:50.077 [2024-11-18 14:21:45.988888] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.077 [2024-11-18 14:21:45.988932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.338 NEW_FUNC[1/716]: 0x4840b8 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:50.338 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:50.338 #4 NEW cov: 12246 ft: 12247 corp: 2/10b lim: 25 exec/s: 0 rss: 72Mb L: 9/9 MS: 2 ChangeByte-CMP- DE: "\001\000\000\000\000\000\000\005"- 00:08:50.338 [2024-11-18 14:21:46.329525] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.338 [2024-11-18 14:21:46.329565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.338 [2024-11-18 14:21:46.329686] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:50.338 [2024-11-18 14:21:46.329710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.338 [2024-11-18 14:21:46.329834] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:50.338 [2024-11-18 14:21:46.329856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.338 #5 NEW cov: 12359 ft: 13214 corp: 3/27b lim: 25 exec/s: 0 rss: 72Mb L: 17/17 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\005"- 00:08:50.338 [2024-11-18 14:21:46.389330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.338 [2024-11-18 
14:21:46.389356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.338 #6 NEW cov: 12365 ft: 13468 corp: 4/36b lim: 25 exec/s: 0 rss: 72Mb L: 9/17 MS: 1 ChangeBit- 00:08:50.338 [2024-11-18 14:21:46.429453] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.338 [2024-11-18 14:21:46.429485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.338 [2024-11-18 14:21:46.429612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:50.338 [2024-11-18 14:21:46.429637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.338 #15 NEW cov: 12450 ft: 13891 corp: 5/47b lim: 25 exec/s: 0 rss: 72Mb L: 11/17 MS: 4 CopyPart-InsertByte-InsertByte-CMP- DE: "\000\213\213\301\256b\356("- 00:08:50.598 [2024-11-18 14:21:46.470020] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.598 [2024-11-18 14:21:46.470053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.598 [2024-11-18 14:21:46.470170] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:50.598 [2024-11-18 14:21:46.470192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.598 [2024-11-18 14:21:46.470312] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:50.598 [2024-11-18 14:21:46.470339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.598 [2024-11-18 14:21:46.470467] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:50.598 [2024-11-18 14:21:46.470488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.598 #16 NEW cov: 12450 ft: 14359 corp: 6/69b lim: 25 exec/s: 0 rss: 72Mb L: 22/22 MS: 1 InsertRepeatedBytes- 00:08:50.599 [2024-11-18 14:21:46.530029] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.599 [2024-11-18 14:21:46.530063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.599 [2024-11-18 14:21:46.530172] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:50.599 [2024-11-18 14:21:46.530197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.599 [2024-11-18 14:21:46.530328] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:50.599 [2024-11-18 14:21:46.530350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.599 #17 NEW cov: 12450 ft: 14431 corp: 7/86b lim: 25 exec/s: 0 rss: 72Mb L: 17/22 MS: 1 ChangeByte- 00:08:50.599 
[2024-11-18 14:21:46.589799] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.599 [2024-11-18 14:21:46.589832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.599 #18 NEW cov: 12450 ft: 14622 corp: 8/95b lim: 25 exec/s: 0 rss: 72Mb L: 9/22 MS: 1 EraseBytes- 00:08:50.599 [2024-11-18 14:21:46.640514] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.599 [2024-11-18 14:21:46.640546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.599 [2024-11-18 14:21:46.640649] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:50.599 [2024-11-18 14:21:46.640668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.599 [2024-11-18 14:21:46.640795] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:50.599 [2024-11-18 14:21:46.640820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.599 [2024-11-18 14:21:46.640948] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:50.599 [2024-11-18 14:21:46.640971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.599 #19 NEW cov: 12450 ft: 14751 corp: 9/118b lim: 25 exec/s: 0 rss: 73Mb L: 23/23 MS: 1 InsertByte- 00:08:50.599 [2024-11-18 14:21:46.700535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.599 [2024-11-18 14:21:46.700569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.599 [2024-11-18 14:21:46.700667] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:50.599 [2024-11-18 14:21:46.700689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.599 [2024-11-18 14:21:46.700814] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:50.599 [2024-11-18 14:21:46.700833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.599 #20 NEW cov: 12450 ft: 14782 corp: 10/133b lim: 25 exec/s: 0 rss: 73Mb L: 15/23 MS: 1 InsertRepeatedBytes- 00:08:50.859 [2024-11-18 14:21:46.740251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.859 [2024-11-18 14:21:46.740275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.859 #21 NEW cov: 12450 ft: 14900 corp: 11/142b lim: 25 exec/s: 0 rss: 73Mb L: 9/23 MS: 1 CrossOver- 00:08:50.859 [2024-11-18 14:21:46.780880] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.859 [2024-11-18 14:21:46.780909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.859 [2024-11-18 14:21:46.781000] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:50.859 [2024-11-18 14:21:46.781023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.859 [2024-11-18 14:21:46.781146] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:50.859 [2024-11-18 14:21:46.781170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.859 [2024-11-18 14:21:46.781286] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:50.859 [2024-11-18 14:21:46.781309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.859 #22 NEW cov: 12450 ft: 14932 corp: 12/166b lim: 25 exec/s: 0 rss: 73Mb L: 24/24 MS: 1 InsertByte- 00:08:50.859 [2024-11-18 14:21:46.851149] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.859 [2024-11-18 14:21:46.851181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.859 [2024-11-18 14:21:46.851263] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:50.859 [2024-11-18 14:21:46.851285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.859 [2024-11-18 14:21:46.851422] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:50.860 [2024-11-18 14:21:46.851447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.860 [2024-11-18 14:21:46.851576] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:50.860 [2024-11-18 14:21:46.851596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.860 NEW_FUNC[1/1]: 0x1c52b68 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:50.860 #23 NEW cov: 12473 ft: 14992 corp: 13/190b lim: 25 exec/s: 0 rss: 73Mb L: 24/24 MS: 1 CopyPart- 00:08:50.860 [2024-11-18 14:21:46.910981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.860 [2024-11-18 14:21:46.911014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.860 [2024-11-18 14:21:46.911134] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:50.860 [2024-11-18 14:21:46.911159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.860 [2024-11-18 14:21:46.911296] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:50.860 [2024-11-18 14:21:46.911322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.860 #24 NEW cov: 12473 ft: 15076 corp: 14/207b lim: 25 exec/s: 0 rss: 73Mb L: 17/24 MS: 1 PersAutoDict- DE: "\000\213\213\301\256b\356("- 00:08:50.860 [2024-11-18 14:21:46.981061] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.860 [2024-11-18 14:21:46.981092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.860 [2024-11-18 14:21:46.981204] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:50.860 [2024-11-18 14:21:46.981227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.120 #25 NEW cov: 12473 ft: 15110 corp: 15/218b lim: 25 exec/s: 25 rss: 73Mb L: 11/24 MS: 1 ShuffleBytes- 00:08:51.120 [2024-11-18 14:21:47.040992] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.120 [2024-11-18 14:21:47.041021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.120 #26 NEW cov: 12473 ft: 15205 corp: 16/227b lim: 25 exec/s: 26 rss: 73Mb L: 9/24 MS: 1 ShuffleBytes- 00:08:51.120 [2024-11-18 14:21:47.081310] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.120 [2024-11-18 14:21:47.081343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.120 [2024-11-18 14:21:47.081472] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.120 [2024-11-18 14:21:47.081492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.120 #27 NEW cov: 12473 ft: 15236 corp: 17/239b lim: 25 exec/s: 27 rss: 73Mb L: 12/24 MS: 1 EraseBytes- 00:08:51.120 [2024-11-18 14:21:47.121314] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.120 [2024-11-18 14:21:47.121338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.120 #28 NEW cov: 12473 ft: 15246 corp: 18/248b lim: 25 exec/s: 28 rss: 73Mb L: 9/24 MS: 1 ShuffleBytes- 00:08:51.120 [2024-11-18 14:21:47.192053] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.120 [2024-11-18 14:21:47.192086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.121 [2024-11-18 14:21:47.192204] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.121 [2024-11-18 14:21:47.192230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.121 [2024-11-18 14:21:47.192344] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.121 [2024-11-18 14:21:47.192367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 
dnr:1 00:08:51.121 [2024-11-18 14:21:47.192485] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.121 [2024-11-18 14:21:47.192508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.121 #29 NEW cov: 12473 ft: 15293 corp: 19/271b lim: 25 exec/s: 29 rss: 73Mb L: 23/24 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\002"- 00:08:51.121 [2024-11-18 14:21:47.231877] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.121 [2024-11-18 14:21:47.231903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.121 [2024-11-18 14:21:47.232026] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.121 [2024-11-18 14:21:47.232048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.382 #30 NEW cov: 12473 ft: 15308 corp: 20/281b lim: 25 exec/s: 30 rss: 73Mb L: 10/24 MS: 1 InsertByte- 00:08:51.382 [2024-11-18 14:21:47.292306] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.382 [2024-11-18 14:21:47.292337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.382 [2024-11-18 14:21:47.292426] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.382 [2024-11-18 14:21:47.292445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.382 [2024-11-18 14:21:47.292562] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.382 [2024-11-18 14:21:47.292590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.382 [2024-11-18 14:21:47.292718] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.382 [2024-11-18 14:21:47.292737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.382 #31 NEW cov: 12473 ft: 15320 corp: 21/304b lim: 25 exec/s: 31 rss: 73Mb L: 23/24 MS: 1 ChangeBit- 00:08:51.382 [2024-11-18 14:21:47.332022] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.382 [2024-11-18 14:21:47.332055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.382 [2024-11-18 14:21:47.332180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.382 [2024-11-18 14:21:47.332202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.382 #32 NEW cov: 12473 ft: 15334 corp: 22/316b lim: 25 exec/s: 32 rss: 74Mb L: 12/24 MS: 1 ChangeBinInt- 00:08:51.382 [2024-11-18 14:21:47.392423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.382 [2024-11-18 14:21:47.392453] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.382 [2024-11-18 14:21:47.392582] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.382 [2024-11-18 14:21:47.392606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.382 [2024-11-18 14:21:47.392727] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.382 [2024-11-18 14:21:47.392758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.382 #33 NEW cov: 12473 ft: 15344 corp: 23/333b lim: 25 exec/s: 33 rss: 74Mb L: 17/24 MS: 1 CopyPart- 00:08:51.382 [2024-11-18 14:21:47.432593] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.382 [2024-11-18 14:21:47.432623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.382 [2024-11-18 14:21:47.432694] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.382 [2024-11-18 14:21:47.432715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.382 [2024-11-18 14:21:47.432842] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.382 [2024-11-18 14:21:47.432863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.382 #34 NEW cov: 12473 ft: 15370 corp: 24/350b lim: 25 exec/s: 34 rss: 74Mb L: 17/24 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\002"- 00:08:51.382 [2024-11-18 14:21:47.482927] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.382 [2024-11-18 14:21:47.482960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.382 [2024-11-18 14:21:47.483060] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.382 [2024-11-18 14:21:47.483081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.382 [2024-11-18 14:21:47.483194] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.382 [2024-11-18 14:21:47.483215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.382 [2024-11-18 14:21:47.483338] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.382 [2024-11-18 14:21:47.483361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.642 #35 NEW cov: 12473 ft: 15395 corp: 25/372b lim: 25 exec/s: 35 rss: 74Mb L: 22/24 MS: 1 InsertRepeatedBytes- 00:08:51.642 [2024-11-18 14:21:47.543010] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 
00:08:51.642 [2024-11-18 14:21:47.543041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.642 [2024-11-18 14:21:47.543125] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.642 [2024-11-18 14:21:47.543148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.642 [2024-11-18 14:21:47.543270] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.642 [2024-11-18 14:21:47.543291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.642 [2024-11-18 14:21:47.543421] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.642 [2024-11-18 14:21:47.543438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.642 #36 NEW cov: 12473 ft: 15417 corp: 26/393b lim: 25 exec/s: 36 rss: 74Mb L: 21/24 MS: 1 CrossOver- 00:08:51.642 [2024-11-18 14:21:47.583060] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.642 [2024-11-18 14:21:47.583091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.642 [2024-11-18 14:21:47.583204] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.642 [2024-11-18 14:21:47.583226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.642 [2024-11-18 14:21:47.583350] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.642 [2024-11-18 14:21:47.583369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.642 #37 NEW cov: 12473 ft: 15479 corp: 27/410b lim: 25 exec/s: 37 rss: 74Mb L: 17/24 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\002"- 00:08:51.642 [2024-11-18 14:21:47.653358] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.642 [2024-11-18 14:21:47.653387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.642 [2024-11-18 14:21:47.653493] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.642 [2024-11-18 14:21:47.653515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.642 [2024-11-18 14:21:47.653651] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.642 [2024-11-18 14:21:47.653675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.642 [2024-11-18 14:21:47.653795] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.642 [2024-11-18 14:21:47.653819] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.642 #38 NEW cov: 12473 ft: 15509 corp: 28/434b lim: 25 exec/s: 38 rss: 74Mb L: 24/24 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\002"- 00:08:51.642 [2024-11-18 14:21:47.723618] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.642 [2024-11-18 14:21:47.723654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.642 [2024-11-18 14:21:47.723742] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.642 [2024-11-18 14:21:47.723766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.642 [2024-11-18 14:21:47.723892] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.642 [2024-11-18 14:21:47.723916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.642 [2024-11-18 14:21:47.724038] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.642 [2024-11-18 14:21:47.724060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.642 #39 NEW cov: 12473 ft: 15517 corp: 29/455b lim: 25 exec/s: 39 rss: 74Mb L: 21/24 MS: 1 InsertRepeatedBytes- 00:08:51.642 [2024-11-18 14:21:47.763885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.642 [2024-11-18 14:21:47.763915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.642 [2024-11-18 14:21:47.764025] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.642 [2024-11-18 14:21:47.764047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.642 [2024-11-18 14:21:47.764182] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.643 [2024-11-18 14:21:47.764207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.643 [2024-11-18 14:21:47.764328] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.643 [2024-11-18 14:21:47.764352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.643 [2024-11-18 14:21:47.764473] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:51.643 [2024-11-18 14:21:47.764490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.903 #40 NEW cov: 12473 ft: 15557 corp: 30/480b lim: 25 exec/s: 40 rss: 74Mb L: 25/25 MS: 1 CrossOver- 00:08:51.903 [2024-11-18 14:21:47.823912] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.903 [2024-11-18 14:21:47.823941] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.903 [2024-11-18 14:21:47.824037] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.903 [2024-11-18 14:21:47.824060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.903 [2024-11-18 14:21:47.824175] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.903 [2024-11-18 14:21:47.824198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.903 [2024-11-18 14:21:47.824320] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.903 [2024-11-18 14:21:47.824339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.903 #41 NEW cov: 12473 ft: 15561 corp: 31/501b lim: 25 exec/s: 41 rss: 74Mb L: 21/25 MS: 1 EraseBytes- 00:08:51.903 [2024-11-18 14:21:47.883515] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.903 [2024-11-18 14:21:47.883546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.903 #42 NEW cov: 12473 ft: 15562 corp: 32/510b lim: 25 exec/s: 42 rss: 74Mb L: 9/25 MS: 1 ChangeByte- 00:08:51.903 [2024-11-18 14:21:47.924157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.903 [2024-11-18 14:21:47.924187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.903 [2024-11-18 14:21:47.924280] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.903 [2024-11-18 14:21:47.924299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.903 [2024-11-18 14:21:47.924410] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.903 [2024-11-18 14:21:47.924428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.903 [2024-11-18 14:21:47.924553] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.903 [2024-11-18 14:21:47.924576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.903 #43 NEW cov: 12473 ft: 15573 corp: 33/534b lim: 25 exec/s: 21 rss: 74Mb L: 24/25 MS: 1 ChangeBit- 00:08:51.903 #43 DONE cov: 12473 ft: 15573 corp: 33/534b lim: 25 exec/s: 21 rss: 74Mb 00:08:51.903 ###### Recommended dictionary. ###### 00:08:51.903 "\001\000\000\000\000\000\000\005" # Uses: 1 00:08:51.903 "\000\213\213\301\256b\356(" # Uses: 1 00:08:51.903 "\377\377\377\377\377\377\377\002" # Uses: 3 00:08:51.903 ###### End of recommended dictionary. 
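The "#N NEW ..." and "#43 DONE ..." lines above are libFuzzer's standard status format: cov counts covered code blocks/edges, ft counts distinct coverage features, corp gives the corpus size as inputs/bytes, lim is the current input-length cap, exec/s and rss report throughput and memory, L reports input lengths, and MS names the mutation sequence (ShuffleBytes, EraseBytes, PersAutoDict, and so on) that produced the input. As a quick way to pull the coverage trajectory out of a saved console log, a small shell sketch (the log file name is a placeholder, not part of this run):

    # extract event id, coverage, features, and throughput from libFuzzer status lines
    grep -oE '#[0-9]+ NEW cov: [0-9]+ ft: [0-9]+ corp: [^ ]+ lim: [0-9]+ exec/s: [0-9]+' console.log \
      | awk '{ print $1, "cov=" $4, "ft=" $6, "exec/s=" $12 }'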
######
00:08:51.903 Done 43 runs in 2 second(s)
00:08:52.164 14:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_23.conf /var/tmp/suppress_nvmf_fuzz
14:21:48 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ ))
14:21:48 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num ))
14:21:48 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1
14:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=24
14:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1
14:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1
14:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
14:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf
14:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
14:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
14:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 24
14:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4424
14:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
14:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424'
14:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
14:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
14:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
14:21:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24
[2024-11-18 14:21:48.107922] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization...
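The xtrace above also documents how start_llvm_fuzz 24 isolates this fuzzer instance: printf %02d pads the fuzzer number, the script settles on port=4424, and that port is spliced into both the transport ID and the JSON config before llvm_nvme_fuzz is launched. A minimal sketch of the same substitution (the "44" prefix and the local file names are assumptions read off the trace, not lifted from run.sh itself):

    fuzzer_type=24
    port="44$(printf %02d "$fuzzer_type")"   # reproduces the port=4424 seen above
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    # rewrite the stock config's default listener port 4420, as the sed call above does
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" fuzz_json.conf > "/tmp/fuzz_json_${fuzzer_type}.conf"

The two echo leak: lines feed the LSan suppression file named in LSAN_OPTIONS, so the known allocations in spdk_nvmf_qpair_disconnect and nvmf_ctrlr_create do not fail the run as leaks.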
00:08:52.164 [2024-11-18 14:21:48.107992] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid333196 ] 00:08:52.425 [2024-11-18 14:21:48.383052] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:52.425 [2024-11-18 14:21:48.404724] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:52.425 [2024-11-18 14:21:48.456922] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:52.425 [2024-11-18 14:21:48.473242] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:08:52.425 INFO: Running with entropic power schedule (0xFF, 100). 00:08:52.425 INFO: Seed: 3895095543 00:08:52.425 INFO: Loaded 1 modules (387495 inline 8-bit counters): 387495 [0x2acbe4c, 0x2b2a7f3), 00:08:52.425 INFO: Loaded 1 PC tables (387495 PCs): 387495 [0x2b2a7f8,0x3114268), 00:08:52.425 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:52.425 INFO: A corpus is not provided, starting from an empty corpus 00:08:52.425 #2 INITED exec/s: 0 rss: 65Mb 00:08:52.425 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:52.425 This may also happen if the target rejected all inputs we tried so far 00:08:52.425 [2024-11-18 14:21:48.518145] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18012703036530096633 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.425 [2024-11-18 14:21:48.518178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.425 [2024-11-18 14:21:48.518227] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.425 [2024-11-18 14:21:48.518245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.425 [2024-11-18 14:21:48.518274] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.425 [2024-11-18 14:21:48.518290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.425 [2024-11-18 14:21:48.518318] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.425 [2024-11-18 14:21:48.518334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.945 NEW_FUNC[1/716]: 0x4851a8 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:08:52.945 NEW_FUNC[2/716]: 0x495e28 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:52.945 #6 NEW cov: 12317 ft: 12317 corp: 2/100b lim: 100 exec/s: 0 rss: 73Mb L: 99/99 MS: 4 ChangeBinInt-CrossOver-CopyPart-InsertRepeatedBytes- 00:08:52.945 [2024-11-18 14:21:48.889069] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE 
sqid:1 cid:0 nsid:0 lba:18012703036530096633 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.946 [2024-11-18 14:21:48.889106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.946 [2024-11-18 14:21:48.889155] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.946 [2024-11-18 14:21:48.889173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.946 [2024-11-18 14:21:48.889207] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.946 [2024-11-18 14:21:48.889223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.946 [2024-11-18 14:21:48.889251] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.946 [2024-11-18 14:21:48.889267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.946 NEW_FUNC[1/1]: 0x1003398 in rte_rdtsc /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/rte_cycles.h:31 00:08:52.946 #7 NEW cov: 12431 ft: 12930 corp: 3/183b lim: 100 exec/s: 0 rss: 73Mb L: 83/99 MS: 1 EraseBytes- 00:08:52.946 [2024-11-18 14:21:48.980038] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18012703036530096633 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.946 [2024-11-18 14:21:48.980067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.946 [2024-11-18 14:21:48.980130] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.946 [2024-11-18 14:21:48.980147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.946 [2024-11-18 14:21:48.980202] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.946 [2024-11-18 14:21:48.980218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.946 [2024-11-18 14:21:48.980275] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.946 [2024-11-18 14:21:48.980291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.946 #8 NEW cov: 12437 ft: 13204 corp: 4/266b lim: 100 exec/s: 0 rss: 73Mb L: 83/99 MS: 1 ShuffleBytes- 00:08:52.946 [2024-11-18 14:21:49.040247] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18012703036530096633 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.946 [2024-11-18 14:21:49.040273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.946 [2024-11-18 14:21:49.040337] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.946 [2024-11-18 14:21:49.040352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.946 [2024-11-18 14:21:49.040409] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.946 [2024-11-18 14:21:49.040426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.946 [2024-11-18 14:21:49.040482] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.946 [2024-11-18 14:21:49.040497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.207 #9 NEW cov: 12522 ft: 13404 corp: 5/349b lim: 100 exec/s: 0 rss: 73Mb L: 83/99 MS: 1 ChangeBinInt- 00:08:53.207 [2024-11-18 14:21:49.100065] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18012703036530096633 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.207 [2024-11-18 14:21:49.100096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.207 [2024-11-18 14:21:49.100149] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.207 [2024-11-18 14:21:49.100167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.207 #10 NEW cov: 12522 ft: 14075 corp: 6/391b lim: 100 exec/s: 0 rss: 73Mb L: 42/99 MS: 1 EraseBytes- 00:08:53.207 [2024-11-18 14:21:49.140010] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65296 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.207 [2024-11-18 14:21:49.140037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.207 #13 NEW cov: 12522 ft: 14954 corp: 7/411b lim: 100 exec/s: 0 rss: 73Mb L: 20/99 MS: 3 CMP-CopyPart-CopyPart- DE: "\377\377\377\377\377\377\377\017"- 00:08:53.207 [2024-11-18 14:21:49.180564] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18012703036530096633 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.207 [2024-11-18 14:21:49.180608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.207 [2024-11-18 14:21:49.180661] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.207 [2024-11-18 14:21:49.180677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.207 [2024-11-18 14:21:49.180734] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.207 [2024-11-18 14:21:49.180750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.207 [2024-11-18 14:21:49.180804] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18012703036681091579 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.207 [2024-11-18 14:21:49.180818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.207 #14 NEW cov: 12522 ft: 15030 corp: 8/510b lim: 100 exec/s: 0 rss: 73Mb L: 99/99 MS: 1 ChangeBit- 00:08:53.207 [2024-11-18 14:21:49.220262] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65296 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.207 [2024-11-18 14:21:49.220291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.207 #15 NEW cov: 12522 ft: 15081 corp: 9/530b lim: 100 exec/s: 0 rss: 73Mb L: 20/99 MS: 1 ChangeBit- 00:08:53.207 [2024-11-18 14:21:49.280894] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18012703036530096633 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.207 [2024-11-18 14:21:49.280921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.207 [2024-11-18 14:21:49.280988] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.207 [2024-11-18 14:21:49.281004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.207 [2024-11-18 14:21:49.281060] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.207 [2024-11-18 14:21:49.281075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.207 [2024-11-18 14:21:49.281134] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18012703036681091577 len:63752 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.207 [2024-11-18 14:21:49.281150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.207 #16 NEW cov: 12522 ft: 15143 corp: 10/613b lim: 100 exec/s: 0 rss: 73Mb L: 83/99 MS: 1 ChangeBinInt- 00:08:53.207 [2024-11-18 14:21:49.320720] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18012702491069250041 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.207 [2024-11-18 14:21:49.320747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.207 [2024-11-18 14:21:49.320786] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.207 [2024-11-18 14:21:49.320803] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.467 #17 NEW cov: 12522 ft: 15186 corp: 11/656b lim: 100 exec/s: 0 rss: 73Mb L: 43/99 MS: 1 InsertByte- 00:08:53.467 [2024-11-18 14:21:49.380887] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:790937516004473337 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.467 [2024-11-18 14:21:49.380915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.467 [2024-11-18 14:21:49.380969] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.467 [2024-11-18 14:21:49.380984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.467 NEW_FUNC[1/1]: 0x1c52b68 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:53.467 #18 NEW cov: 12539 ft: 15259 corp: 12/699b lim: 100 exec/s: 0 rss: 73Mb L: 43/99 MS: 1 ChangeBinInt- 00:08:53.467 [2024-11-18 14:21:49.441034] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18012702491069250041 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.467 [2024-11-18 14:21:49.441062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.467 [2024-11-18 14:21:49.441101] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.467 [2024-11-18 14:21:49.441118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.467 #19 NEW cov: 12539 ft: 15353 corp: 13/742b lim: 100 exec/s: 0 rss: 73Mb L: 43/99 MS: 1 ShuffleBytes- 00:08:53.467 [2024-11-18 14:21:49.481416] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18012703036530096633 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.467 [2024-11-18 14:21:49.481443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.467 [2024-11-18 14:21:49.481495] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.467 [2024-11-18 14:21:49.481512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.468 [2024-11-18 14:21:49.481571] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.468 [2024-11-18 14:21:49.481587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.468 [2024-11-18 14:21:49.481644] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.468 [2024-11-18 14:21:49.481664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.468 #20 NEW cov: 12539 ft: 15378 corp: 14/825b lim: 100 exec/s: 0 rss: 73Mb L: 83/99 MS: 1 ChangeBinInt- 00:08:53.468 [2024-11-18 14:21:49.521373] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18012702491069250041 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.468 [2024-11-18 14:21:49.521399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.468 [2024-11-18 14:21:49.521453] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.468 [2024-11-18 14:21:49.521469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.468 [2024-11-18 14:21:49.521526] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.468 [2024-11-18 14:21:49.521541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.468 #21 NEW cov: 12539 ft: 15682 corp: 15/893b lim: 100 exec/s: 21 rss: 73Mb L: 68/99 MS: 1 CrossOver- 00:08:53.468 [2024-11-18 14:21:49.561170] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65296 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.468 [2024-11-18 14:21:49.561197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.728 #22 NEW cov: 12539 ft: 15774 corp: 16/913b lim: 100 exec/s: 22 rss: 73Mb L: 20/99 MS: 1 ChangeBit- 00:08:53.728 [2024-11-18 14:21:49.621367] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65296 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.728 [2024-11-18 14:21:49.621394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.728 #23 NEW cov: 12539 ft: 15810 corp: 17/934b lim: 100 exec/s: 23 rss: 73Mb L: 21/99 MS: 1 InsertByte- 00:08:53.728 [2024-11-18 14:21:49.661629] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:790937516004473337 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.728 [2024-11-18 14:21:49.661656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.728 [2024-11-18 14:21:49.661727] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18012703036681091577 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.728 [2024-11-18 14:21:49.661744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.728 #24 NEW cov: 12539 ft: 15827 corp: 18/977b lim: 100 exec/s: 24 rss: 74Mb L: 43/99 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\017"- 00:08:53.728 [2024-11-18 14:21:49.721806] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18012703036530096633 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.728 [2024-11-18 14:21:49.721831] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.728 [2024-11-18 14:21:49.721884] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:4193856000 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.728 [2024-11-18 14:21:49.721901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.728 #25 NEW cov: 12539 ft: 15837 corp: 19/1019b lim: 100 exec/s: 25 rss: 74Mb L: 42/99 MS: 1 ChangeBinInt- 00:08:53.728 [2024-11-18 14:21:49.762216] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18012703036530096633 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.728 [2024-11-18 14:21:49.762244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.728 [2024-11-18 14:21:49.762299] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.728 [2024-11-18 14:21:49.762315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.728 [2024-11-18 14:21:49.762370] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.728 [2024-11-18 14:21:49.762387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.728 [2024-11-18 14:21:49.762443] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18012703036681091577 len:63752 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.728 [2024-11-18 14:21:49.762458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.728 #26 NEW cov: 12539 ft: 15850 corp: 20/1102b lim: 100 exec/s: 26 rss: 74Mb L: 83/99 MS: 1 ChangeBinInt- 00:08:53.728 [2024-11-18 14:21:49.822365] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18012703036530096633 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.728 [2024-11-18 14:21:49.822392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.728 [2024-11-18 14:21:49.822464] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.728 [2024-11-18 14:21:49.822481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.728 [2024-11-18 14:21:49.822532] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18012703036681091577 len:64000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.728 [2024-11-18 14:21:49.822546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.728 [2024-11-18 14:21:49.822607] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
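The giant decimal LBAs that repeat through this run are easier to read in hex: they are the fuzzer's repeated-0xF9 fill pattern with an occasional mutated byte, which is exactly what mutators like ChangeBit and ChangeBinInt produce. A quick check with bc (plain shell arithmetic, not part of the log; bc is used because these values overflow signed 64-bit shell arithmetic):

    echo 'obase=16; 18012703036681091577' | bc   # F9F9F9F9F9F9F9F9
    echo 'obase=16; 18012703036530096633' | bc   # F9F9F9F9F0F9F9F9 (one byte F9 -> F0)
    echo 'obase=16; 63994' | bc                  # F9FA, the len: field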
00:08:53.728 [2024-11-18 14:21:49.822633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.728 #27 NEW cov: 12539 ft: 15870 corp: 21/1191b lim: 100 exec/s: 27 rss: 74Mb L: 89/99 MS: 1 InsertRepeatedBytes- 00:08:53.989 [2024-11-18 14:21:49.862688] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18012703036530096633 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.989 [2024-11-18 14:21:49.862716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.989 [2024-11-18 14:21:49.862773] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.989 [2024-11-18 14:21:49.862789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.989 [2024-11-18 14:21:49.862842] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.989 [2024-11-18 14:21:49.862856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.989 [2024-11-18 14:21:49.862913] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18156818224756947449 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.989 [2024-11-18 14:21:49.862929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.989 [2024-11-18 14:21:49.862984] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.989 [2024-11-18 14:21:49.863000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:53.989 #28 NEW cov: 12539 ft: 15930 corp: 22/1291b lim: 100 exec/s: 28 rss: 74Mb L: 100/100 MS: 1 CopyPart- 00:08:53.989 [2024-11-18 14:21:49.922661] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18012703036530096633 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.989 [2024-11-18 14:21:49.922688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.989 [2024-11-18 14:21:49.922740] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18012703036672702969 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.989 [2024-11-18 14:21:49.922756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.989 [2024-11-18 14:21:49.922810] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.989 [2024-11-18 14:21:49.922826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.989 [2024-11-18 14:21:49.922878] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 
lba:18012703036681091577 len:63752 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.989 [2024-11-18 14:21:49.922894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.989 #29 NEW cov: 12539 ft: 15952 corp: 23/1374b lim: 100 exec/s: 29 rss: 74Mb L: 83/100 MS: 1 ChangeBit- 00:08:53.989 [2024-11-18 14:21:49.982833] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18012703036530096633 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.989 [2024-11-18 14:21:49.982859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.989 [2024-11-18 14:21:49.982915] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18012703036672702969 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.989 [2024-11-18 14:21:49.982931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.989 [2024-11-18 14:21:49.982999] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.989 [2024-11-18 14:21:49.983017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.989 [2024-11-18 14:21:49.983072] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:1151226031806341119 len:63752 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.989 [2024-11-18 14:21:49.983088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.989 #30 NEW cov: 12539 ft: 15954 corp: 24/1457b lim: 100 exec/s: 30 rss: 74Mb L: 83/100 MS: 1 CrossOver- 00:08:53.989 [2024-11-18 14:21:50.042734] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18012702491069250041 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.990 [2024-11-18 14:21:50.042762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.990 [2024-11-18 14:21:50.042813] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.990 [2024-11-18 14:21:50.042830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.990 #31 NEW cov: 12539 ft: 16027 corp: 25/1500b lim: 100 exec/s: 31 rss: 74Mb L: 43/100 MS: 1 ChangeByte- 00:08:53.990 [2024-11-18 14:21:50.103189] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18012703036530096633 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.990 [2024-11-18 14:21:50.103219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.990 [2024-11-18 14:21:50.103268] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18012703036672702969 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.990 [2024-11-18 14:21:50.103283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.990 [2024-11-18 14:21:50.103337] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.990 [2024-11-18 14:21:50.103352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.990 [2024-11-18 14:21:50.103407] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18012703036681091577 len:63752 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.990 [2024-11-18 14:21:50.103421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.250 #32 NEW cov: 12539 ft: 16047 corp: 26/1583b lim: 100 exec/s: 32 rss: 74Mb L: 83/100 MS: 1 ChangeBit- 00:08:54.250 [2024-11-18 14:21:50.143335] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18012703036530096633 len:63998 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.250 [2024-11-18 14:21:50.143362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.250 [2024-11-18 14:21:50.143411] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.250 [2024-11-18 14:21:50.143427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.250 [2024-11-18 14:21:50.143481] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.250 [2024-11-18 14:21:50.143497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.250 [2024-11-18 14:21:50.143555] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18012703036681091577 len:63752 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.250 [2024-11-18 14:21:50.143570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.250 #33 NEW cov: 12539 ft: 16070 corp: 27/1666b lim: 100 exec/s: 33 rss: 74Mb L: 83/100 MS: 1 ChangeBit- 00:08:54.250 [2024-11-18 14:21:50.183577] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18012703036530096633 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.250 [2024-11-18 14:21:50.183603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.250 [2024-11-18 14:21:50.183676] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.250 [2024-11-18 14:21:50.183693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.250 [2024-11-18 14:21:50.183752] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.250 [2024-11-18 14:21:50.183766] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:54.250 [2024-11-18 14:21:50.183818] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18156818224756947449 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:54.250 [2024-11-18 14:21:50.183834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:54.250 [2024-11-18 14:21:50.183889] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:54.250 [2024-11-18 14:21:50.183905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1
00:08:54.250 #34 NEW cov: 12539 ft: 16107 corp: 28/1766b lim: 100 exec/s: 34 rss: 74Mb L: 100/100 MS: 1 ShuffleBytes-
00:08:54.250 [2024-11-18 14:21:50.243283] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18012702491069250041 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:54.250 [2024-11-18 14:21:50.243309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:54.250 [2024-11-18 14:21:50.243347] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:54.250 [2024-11-18 14:21:50.243363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:54.250 #35 NEW cov: 12539 ft: 16112 corp: 29/1809b lim: 100 exec/s: 35 rss: 74Mb L: 43/100 MS: 1 CrossOver-
00:08:54.250 [2024-11-18 14:21:50.283230] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18012702491069250041 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:54.251 [2024-11-18 14:21:50.283258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:54.251 #36 NEW cov: 12539 ft: 16166 corp: 30/1846b lim: 100 exec/s: 36 rss: 74Mb L: 37/100 MS: 1 EraseBytes-
00:08:54.251 [2024-11-18 14:21:50.343884] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18012703036530096623 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:54.251 [2024-11-18 14:21:50.343911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:54.251 [2024-11-18 14:21:50.343965] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:54.251 [2024-11-18 14:21:50.343982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:54.251 [2024-11-18 14:21:50.344036] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:54.251 [2024-11-18 14:21:50.344052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:54.251 [2024-11-18 14:21:50.344105] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18012703036681091577 len:63752 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:54.251 [2024-11-18 14:21:50.344119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:54.251 #37 NEW cov: 12539 ft: 16213 corp: 31/1929b lim: 100 exec/s: 37 rss: 74Mb L: 83/100 MS: 1 ChangeBinInt-
00:08:54.511 [2024-11-18 14:21:50.384025] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18012703036530096633 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:54.511 [2024-11-18 14:21:50.384055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:54.511 [2024-11-18 14:21:50.384093] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18012703036672702969 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:54.511 [2024-11-18 14:21:50.384108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:54.511 [2024-11-18 14:21:50.384161] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:54.511 [2024-11-18 14:21:50.384175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:54.511 [2024-11-18 14:21:50.384229] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18012703036681091577 len:63752 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:54.511 [2024-11-18 14:21:50.384243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:54.511 #38 NEW cov: 12546 ft: 16227 corp: 32/2012b lim: 100 exec/s: 38 rss: 74Mb L: 83/100 MS: 1 ChangeBit-
00:08:54.511 [2024-11-18 14:21:50.444205] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18012703036530096633 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:54.511 [2024-11-18 14:21:50.444232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:54.511 [2024-11-18 14:21:50.444303] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18012703036672702969 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:54.511 [2024-11-18 14:21:50.444319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:54.511 [2024-11-18 14:21:50.444374] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:54.511 [2024-11-18 14:21:50.444389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:54.511 [2024-11-18 14:21:50.444445] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18012703036681091577 len:63752 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:54.511 [2024-11-18 14:21:50.444461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:54.511 #39 NEW cov: 12546 ft: 16257 corp: 33/2095b lim: 100 exec/s: 39 rss: 75Mb L: 83/100 MS: 1 ChangeBinInt-
00:08:54.511 [2024-11-18 14:21:50.484281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18012703036530096633 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:54.511 [2024-11-18 14:21:50.484308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:54.511 [2024-11-18 14:21:50.484363] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18012703036672702969 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:54.511 [2024-11-18 14:21:50.484379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:54.511 [2024-11-18 14:21:50.484435] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18012703036681091577 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:54.511 [2024-11-18 14:21:50.484451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:54.511 [2024-11-18 14:21:50.484505] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18012703036681091577 len:63994 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:54.511 [2024-11-18 14:21:50.484524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:54.511 #40 NEW cov: 12546 ft: 16266 corp: 34/2186b lim: 100 exec/s: 20 rss: 75Mb L: 91/100 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\017"-
00:08:54.511 #40 DONE cov: 12546 ft: 16266 corp: 34/2186b lim: 100 exec/s: 20 rss: 75Mb
00:08:54.511 ###### Recommended dictionary. ######
00:08:54.511 "\377\377\377\377\377\377\377\017" # Uses: 2
00:08:54.511 ###### End of recommended dictionary. ######
00:08:54.511 Done 40 runs in 2 second(s)
00:08:54.511 14:21:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_24.conf /var/tmp/suppress_nvmf_fuzz
00:08:54.511 14:21:50 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ ))
00:08:54.511 14:21:50 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:54.511 14:21:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@79 -- # trap - SIGINT SIGTERM EXIT
00:08:54.511
00:08:54.511 real 1m3.561s
00:08:54.511 user 1m39.182s
00:08:54.511 sys 0m8.102s
00:08:54.511 14:21:50 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1130 -- # xtrace_disable
00:08:54.511 14:21:50 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@10 -- # set +x
00:08:54.511 ************************************
00:08:54.511 END TEST nvmf_llvm_fuzz
00:08:54.511 ************************************
00:08:54.772 14:21:50 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}"
00:08:54.772 14:21:50 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in
00:08:54.772 14:21:50 llvm_fuzz -- fuzz/llvm.sh@20 -- # run_test vfio_llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh
00:08:54.772 14:21:50 llvm_fuzz -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:08:54.772 14:21:50 llvm_fuzz -- common/autotest_common.sh@1111 -- # xtrace_disable
00:08:54.772 14:21:50 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x
00:08:54.772 ************************************
00:08:54.772 START TEST vfio_llvm_fuzz
00:08:54.772 ************************************
00:08:54.772 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh
00:08:54.772 * Looking for test storage...
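The handoff above (fuzz/llvm.sh@17-20 in the trace) is a plain dispatch loop: each suite name is matched in a case statement and its run.sh is handed to run_test, which supplies the timing and the START TEST / END TEST banners seen in the log. A minimal sketch of that loop, paraphrased from the xtrace rather than copied from the real fuzz/llvm.sh (the contents of the fuzzers array and the $rootdir variable are assumptions; only the nvmf and vfio suites appear in this log):

    fuzzers=(nvmf vfio)                  # assumed list; the log shows only these two suites
    for fuzzer in "${fuzzers[@]}"; do    # llvm.sh@17 in the trace
        case "$fuzzer" in                # llvm.sh@18
            nvmf | vfio)
                # llvm.sh@20: run_test (from autotest_common.sh) times the suite
                # and prints the START TEST / END TEST banners around it
                run_test "${fuzzer}_llvm_fuzz" "$rootdir/test/fuzz/llvm/$fuzzer/run.sh"
                ;;
        esac
    done

The xtrace that resumes below steps through the lcov version gate in scripts/common.sh: lt 1.15 2 calls cmp_versions 1.15 '<' 2, which splits both version strings on any of ".", "-" or ":" and compares them component by component, with decimal normalizing each piece. As a readable companion, this is a paraphrase of what the trace shows, not the actual scripts/common.sh (treating non-numeric components as 0 is an assumption):

    decimal() {                           # trace: scripts/common.sh@353-355
        local d=$1
        [[ $d =~ ^[0-9]+$ ]] && echo "$d" || echo 0   # assumed fallback for non-numeric parts
    }
    cmp_versions() {                      # trace: scripts/common.sh@333-368
        local -a ver1 ver2
        local op=$2 v
        IFS='.-:' read -ra ver1 <<< "$1"  # "1.15" -> (1 15)
        IFS='.-:' read -ra ver2 <<< "$3"  # "2"    -> (2)
        for ((v = 0; v < (${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]}); v++)); do
            ver1[v]=$(decimal "${ver1[v]:-0}")
            ver2[v]=$(decimal "${ver2[v]:-0}")
            ((ver1[v] > ver2[v])) && { [[ $op == '>' ]]; return; }   # first difference decides
            ((ver1[v] < ver2[v])) && { [[ $op == '<' ]]; return; }
        done
        [[ $op == *'='* ]]                # equal versions only satisfy ==, <=, >=
    }
    lt() { cmp_versions "$1" '<' "$2"; }

Here the first components already differ (1 < 2), so lt 1.15 2 returns 0 and the branch/function coverage options for lcov are enabled for the vfio suite, exactly as the LCOV_OPTS export in the trace shows.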
00:08:54.772 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:54.772 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:08:54.772 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:08:54.772 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:08:54.772 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:08:54.772 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:54.772 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:54.772 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:54.772 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:08:54.772 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:08:54.773 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:54.773 --rc genhtml_branch_coverage=1 00:08:54.773 --rc genhtml_function_coverage=1 00:08:54.773 --rc genhtml_legend=1 00:08:54.773 --rc geninfo_all_blocks=1 00:08:54.773 --rc geninfo_unexecuted_blocks=1 00:08:54.773 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:54.773 ' 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:08:54.773 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:54.773 --rc genhtml_branch_coverage=1 00:08:54.773 --rc genhtml_function_coverage=1 00:08:54.773 --rc genhtml_legend=1 00:08:54.773 --rc geninfo_all_blocks=1 00:08:54.773 --rc geninfo_unexecuted_blocks=1 00:08:54.773 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:54.773 ' 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:08:54.773 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:54.773 --rc genhtml_branch_coverage=1 00:08:54.773 --rc genhtml_function_coverage=1 00:08:54.773 --rc genhtml_legend=1 00:08:54.773 --rc geninfo_all_blocks=1 00:08:54.773 --rc geninfo_unexecuted_blocks=1 00:08:54.773 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:54.773 ' 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:08:54.773 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:54.773 --rc genhtml_branch_coverage=1 00:08:54.773 --rc genhtml_function_coverage=1 00:08:54.773 --rc genhtml_legend=1 00:08:54.773 --rc geninfo_all_blocks=1 00:08:54.773 --rc geninfo_unexecuted_blocks=1 00:08:54.773 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:54.773 ' 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@64 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz 
-- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@34 -- # set -e 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@17 -- # CONFIG_MAX_NUMA_NODES=1 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@18 -- # CONFIG_PGO_CAPTURE=n 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@19 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@20 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@21 -- # CONFIG_LTO=n 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@22 -- # CONFIG_ISCSI_INITIATOR=y 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@23 -- # CONFIG_CET=n 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@24 -- # 
CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@25 -- # CONFIG_OCF_PATH= 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@26 -- # CONFIG_RDMA_SET_TOS=y 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@27 -- # CONFIG_AIO_FSDEV=y 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@28 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@29 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@30 -- # CONFIG_UBLK=y 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@31 -- # CONFIG_ISAL_CRYPTO=y 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@32 -- # CONFIG_OPENSSL_PATH= 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@33 -- # CONFIG_OCF=n 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@34 -- # CONFIG_FUSE=n 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@35 -- # CONFIG_VTUNE_DIR= 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@36 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@37 -- # CONFIG_FUZZER=y 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@38 -- # CONFIG_FSDEV=y 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@39 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@40 -- # CONFIG_CRYPTO=n 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@41 -- # CONFIG_PGO_USE=n 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@42 -- # CONFIG_VHOST=y 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@43 -- # CONFIG_DAOS=n 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@44 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@45 -- # CONFIG_DAOS_DIR= 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@46 -- # CONFIG_UNIT_TESTS=n 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@47 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@48 -- # CONFIG_VIRTIO=y 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@49 -- # CONFIG_DPDK_UADK=n 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@50 -- # CONFIG_COVERAGE=y 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@51 -- # CONFIG_RDMA=y 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@52 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@53 -- # CONFIG_HAVE_LZ4=n 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@54 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:54.773 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@55 -- # CONFIG_URING_PATH= 00:08:54.774 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@56 -- # CONFIG_XNVME=n 00:08:54.774 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@57 -- # CONFIG_VFIO_USER=y 00:08:54.774 14:21:50 
llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@58 -- # CONFIG_ARCH=native 00:08:54.774 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@59 -- # CONFIG_HAVE_EVP_MAC=y 00:08:54.774 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@60 -- # CONFIG_URING_ZNS=n 00:08:54.774 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@61 -- # CONFIG_WERROR=y 00:08:54.774 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@62 -- # CONFIG_HAVE_LIBBSD=n 00:08:54.774 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@63 -- # CONFIG_UBSAN=y 00:08:54.774 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@64 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:08:54.774 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@65 -- # CONFIG_IPSEC_MB_DIR= 00:08:54.774 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@66 -- # CONFIG_GOLANG=n 00:08:54.774 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@67 -- # CONFIG_ISAL=y 00:08:54.774 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@68 -- # CONFIG_IDXD_KERNEL=y 00:08:54.774 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@69 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:54.774 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@70 -- # CONFIG_RDMA_PROV=verbs 00:08:54.774 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@71 -- # CONFIG_APPS=y 00:08:54.774 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@72 -- # CONFIG_SHARED=n 00:08:54.774 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@73 -- # CONFIG_HAVE_KEYUTILS=y 00:08:54.774 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@74 -- # CONFIG_FC_PATH= 00:08:55.037 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@75 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:55.037 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@76 -- # CONFIG_FC=n 00:08:55.037 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@77 -- # CONFIG_AVAHI=n 00:08:55.037 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@78 -- # CONFIG_FIO_PLUGIN=y 00:08:55.037 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@79 -- # CONFIG_RAID5F=n 00:08:55.037 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@80 -- # CONFIG_EXAMPLES=y 00:08:55.037 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@81 -- # CONFIG_TESTS=y 00:08:55.037 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@82 -- # CONFIG_CRYPTO_MLX5=n 00:08:55.037 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@83 -- # CONFIG_MAX_LCORES=128 00:08:55.037 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@84 -- # CONFIG_IPSEC_MB=n 00:08:55.037 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@85 -- # CONFIG_PGO_DIR= 00:08:55.037 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@86 -- # CONFIG_DEBUG=y 00:08:55.037 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@87 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:55.037 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@88 -- # CONFIG_CROSS_PREFIX= 00:08:55.037 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@89 -- # CONFIG_COPY_FILE_RANGE=y 00:08:55.037 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@90 -- # CONFIG_URING=n 00:08:55.037 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:55.037 14:21:50 
llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:55.037 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:55.037 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:55.037 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:55.037 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:55.037 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:55.037 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:55.038 #define SPDK_CONFIG_H 00:08:55.038 #define SPDK_CONFIG_AIO_FSDEV 1 00:08:55.038 #define SPDK_CONFIG_APPS 1 00:08:55.038 #define SPDK_CONFIG_ARCH native 00:08:55.038 #undef SPDK_CONFIG_ASAN 00:08:55.038 #undef SPDK_CONFIG_AVAHI 00:08:55.038 #undef SPDK_CONFIG_CET 00:08:55.038 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:08:55.038 #define SPDK_CONFIG_COVERAGE 1 00:08:55.038 #define SPDK_CONFIG_CROSS_PREFIX 00:08:55.038 #undef SPDK_CONFIG_CRYPTO 00:08:55.038 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:55.038 #undef SPDK_CONFIG_CUSTOMOCF 00:08:55.038 #undef SPDK_CONFIG_DAOS 00:08:55.038 #define SPDK_CONFIG_DAOS_DIR 00:08:55.038 #define SPDK_CONFIG_DEBUG 1 00:08:55.038 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:55.038 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:55.038 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:55.038 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:55.038 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:55.038 #undef SPDK_CONFIG_DPDK_UADK 00:08:55.038 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:55.038 #define SPDK_CONFIG_EXAMPLES 1 00:08:55.038 #undef SPDK_CONFIG_FC 00:08:55.038 #define SPDK_CONFIG_FC_PATH 00:08:55.038 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:55.038 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:55.038 #define SPDK_CONFIG_FSDEV 1 00:08:55.038 #undef 
SPDK_CONFIG_FUSE 00:08:55.038 #define SPDK_CONFIG_FUZZER 1 00:08:55.038 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:55.038 #undef SPDK_CONFIG_GOLANG 00:08:55.038 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:55.038 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:08:55.038 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:55.038 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:08:55.038 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:55.038 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:55.038 #undef SPDK_CONFIG_HAVE_LZ4 00:08:55.038 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:08:55.038 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:08:55.038 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:55.038 #define SPDK_CONFIG_IDXD 1 00:08:55.038 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:55.038 #undef SPDK_CONFIG_IPSEC_MB 00:08:55.038 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:55.038 #define SPDK_CONFIG_ISAL 1 00:08:55.038 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:55.038 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:55.038 #define SPDK_CONFIG_LIBDIR 00:08:55.038 #undef SPDK_CONFIG_LTO 00:08:55.038 #define SPDK_CONFIG_MAX_LCORES 128 00:08:55.038 #define SPDK_CONFIG_MAX_NUMA_NODES 1 00:08:55.038 #define SPDK_CONFIG_NVME_CUSE 1 00:08:55.038 #undef SPDK_CONFIG_OCF 00:08:55.038 #define SPDK_CONFIG_OCF_PATH 00:08:55.038 #define SPDK_CONFIG_OPENSSL_PATH 00:08:55.038 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:55.038 #define SPDK_CONFIG_PGO_DIR 00:08:55.038 #undef SPDK_CONFIG_PGO_USE 00:08:55.038 #define SPDK_CONFIG_PREFIX /usr/local 00:08:55.038 #undef SPDK_CONFIG_RAID5F 00:08:55.038 #undef SPDK_CONFIG_RBD 00:08:55.038 #define SPDK_CONFIG_RDMA 1 00:08:55.038 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:55.038 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:55.038 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:55.038 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:55.038 #undef SPDK_CONFIG_SHARED 00:08:55.038 #undef SPDK_CONFIG_SMA 00:08:55.038 #define SPDK_CONFIG_TESTS 1 00:08:55.038 #undef SPDK_CONFIG_TSAN 00:08:55.038 #define SPDK_CONFIG_UBLK 1 00:08:55.038 #define SPDK_CONFIG_UBSAN 1 00:08:55.038 #undef SPDK_CONFIG_UNIT_TESTS 00:08:55.038 #undef SPDK_CONFIG_URING 00:08:55.038 #define SPDK_CONFIG_URING_PATH 00:08:55.038 #undef SPDK_CONFIG_URING_ZNS 00:08:55.038 #undef SPDK_CONFIG_USDT 00:08:55.038 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:55.038 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:55.038 #define SPDK_CONFIG_VFIO_USER 1 00:08:55.038 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:55.038 #define SPDK_CONFIG_VHOST 1 00:08:55.038 #define SPDK_CONFIG_VIRTIO 1 00:08:55.038 #undef SPDK_CONFIG_VTUNE 00:08:55.038 #define SPDK_CONFIG_VTUNE_DIR 00:08:55.038 #define SPDK_CONFIG_WERROR 1 00:08:55.038 #define SPDK_CONFIG_WPDK_DIR 00:08:55.038 #undef SPDK_CONFIG_XNVME 00:08:55.038 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@15 -- # shopt -s extglob 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- 
scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@5 -- # export PATH 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- pm/common@64 -- # TEST_TAG=N/A 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- pm/common@67 -- # 
PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- pm/common@68 -- # uname -s 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- pm/common@68 -- # PM_OS=Linux 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- pm/common@76 -- # SUDO[0]= 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- pm/common@76 -- # SUDO[1]='sudo -E' 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ Linux == Linux ]] 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power ]] 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@58 -- # : 1 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@62 -- # : 0 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:55.038 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@64 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@66 -- # : 1 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@68 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@70 -- # : 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@72 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@74 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@76 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@78 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@80 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@82 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@84 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@86 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@88 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@90 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@92 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@94 -- # : 0 00:08:55.039 14:21:50 
llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@96 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@98 -- # : 1 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@100 -- # : 1 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@102 -- # : rdma 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@104 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@106 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@108 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@110 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@112 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@114 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@116 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@118 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@120 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@122 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@124 -- # : 1 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@126 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@128 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:08:55.039 
14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@130 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@132 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@134 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@136 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@137 -- # export SPDK_TEST_VMD 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@138 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@140 -- # : v23.11 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@142 -- # : true 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@144 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@146 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@148 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@150 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@152 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@154 -- # : 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@156 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@158 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@160 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@162 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@164 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@165 -- # export 
SPDK_TEST_ACCEL_DSA 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@166 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@169 -- # : 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@171 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@173 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@175 -- # : 1 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@177 -- # : 0 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@178 -- # export SPDK_TEST_NVME_INTERRUPT 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@181 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@181 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@182 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@182 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@183 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:55.039 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@183 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@184 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@184 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@187 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@187 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@191 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@191 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@195 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@195 -- # PYTHONDONTWRITEBYTECODE=1 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@199 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@199 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@200 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@200 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@204 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@205 -- # rm -rf /var/tmp/asan_suppression_file 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@206 -- # cat 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@242 -- # echo leak:libfuse3.so 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@244 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@244 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@246 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@246 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@248 -- # '[' -z /var/spdk/dependencies ']' 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@251 -- # export DEPENDENCY_DIR 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@255 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@255 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@256 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@256 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@259 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@259 -- 
# QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@260 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@260 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@262 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@262 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@265 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@265 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@267 -- # _LCOV_MAIN=0 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@268 -- # _LCOV_LLVM=1 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@269 -- # _LCOV= 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ '' == *clang* ]] 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ 1 -eq 1 ]] 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # _LCOV=1 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@272 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@273 -- # _lcov_opt[_LCOV_MAIN]= 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@275 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@278 -- # '[' 0 -eq 0 ']' 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@279 -- # export valgrind= 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@279 -- # valgrind= 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@285 -- # uname -s 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@285 -- # '[' Linux = Linux ']' 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@286 -- # HUGEMEM=4096 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@287 -- # export CLEAR_HUGE=yes 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@287 -- # CLEAR_HUGE=yes 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@289 -- # MAKE=make 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@290 -- # MAKEFLAGS=-j112 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@306 -- # export HUGEMEM=4096 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@306 -- # HUGEMEM=4096 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@308 -- # NO_HUGE=() 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@309 -- # TEST_MODE= 00:08:55.040 14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@331 -- # [[ -z 333734 ]] 00:08:55.040 
14:21:50 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@331 -- # kill -0 333734 00:08:55.040 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1678 -- # set_test_storage 2147483648 00:08:55.040 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@341 -- # [[ -v testdir ]] 00:08:55.040 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@343 -- # local requested_size=2147483648 00:08:55.040 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@344 -- # local mount target_dir 00:08:55.040 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@346 -- # local -A mounts fss sizes avails uses 00:08:55.040 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@347 -- # local source fs size avail mount use 00:08:55.040 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@349 -- # local storage_fallback storage_candidates 00:08:55.040 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@351 -- # mktemp -udt spdk.XXXXXX 00:08:55.040 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@351 -- # storage_fallback=/tmp/spdk.tZSN6H 00:08:55.040 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@356 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:55.040 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@358 -- # [[ -n '' ]] 00:08:55.040 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@363 -- # [[ -n '' ]] 00:08:55.040 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@368 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.tZSN6H/tests/vfio /tmp/spdk.tZSN6H 00:08:55.040 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # requested_size=2214592512 00:08:55.040 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:55.040 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@340 -- # df -T 00:08:55.040 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@340 -- # grep -v Filesystem 00:08:55.040 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_devtmpfs 00:08:55.040 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=devtmpfs 00:08:55.040 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=67108864 00:08:55.040 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=67108864 00:08:55.040 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=0 00:08:55.040 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:55.040 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=/dev/pmem0 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=ext2 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=4096 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=5284429824 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5284425728 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- 
common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_root 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=overlay 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=52372758528 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=61730582528 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=9357824000 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=30861860864 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=30865289216 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=3428352 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=12340117504 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=12346118144 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=6000640 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=30865113088 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=30865293312 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=180224 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=6173044736 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=6173057024 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=12288 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@379 -- # printf '* Looking for test storage...\n' 00:08:55.041 * Looking for test storage... 
00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@381 -- # local target_space new_size 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@382 -- # for target_dir in "${storage_candidates[@]}" 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # mount=/ 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@387 -- # target_space=52372758528 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@388 -- # (( target_space == 0 || target_space < requested_size )) 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@391 -- # (( target_space >= requested_size )) 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == tmpfs ]] 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == ramfs ]] 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ / == / ]] 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@394 -- # new_size=11572416512 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@395 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@400 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@400 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@401 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:55.041 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@402 -- # return 0 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1680 -- # set -o errtrace 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # shopt -s extdebug 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1682 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1684 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1685 -- # true 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1687 -- # xtrace_fd 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@27 -- # exec 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@29 -- # exec 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@16 -- # unset -v 
'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@18 -- # set -x 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:08:55.041 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:55.041 --rc genhtml_branch_coverage=1 00:08:55.041 --rc genhtml_function_coverage=1 00:08:55.041 --rc genhtml_legend=1 00:08:55.041 --rc geninfo_all_blocks=1 00:08:55.041 --rc geninfo_unexecuted_blocks=1 00:08:55.041 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:55.041 ' 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:08:55.041 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:55.041 --rc genhtml_branch_coverage=1 00:08:55.041 --rc genhtml_function_coverage=1 00:08:55.041 --rc genhtml_legend=1 00:08:55.041 --rc geninfo_all_blocks=1 00:08:55.041 --rc geninfo_unexecuted_blocks=1 00:08:55.041 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:55.041 ' 00:08:55.041 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:08:55.041 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:55.041 --rc genhtml_branch_coverage=1 00:08:55.041 --rc genhtml_function_coverage=1 00:08:55.041 --rc genhtml_legend=1 00:08:55.041 --rc geninfo_all_blocks=1 00:08:55.042 --rc geninfo_unexecuted_blocks=1 00:08:55.042 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:55.042 ' 00:08:55.042 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:08:55.042 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:55.042 --rc genhtml_branch_coverage=1 00:08:55.042 --rc genhtml_function_coverage=1 00:08:55.042 --rc genhtml_legend=1 00:08:55.042 --rc geninfo_all_blocks=1 00:08:55.042 --rc geninfo_unexecuted_blocks=1 00:08:55.042 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:55.042 ' 00:08:55.042 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@65 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:55.042 14:21:51 
llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@8 -- # pids=() 00:08:55.042 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@67 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:55.042 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@68 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:55.042 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@68 -- # fuzz_num=7 00:08:55.042 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@69 -- # (( fuzz_num != 0 )) 00:08:55.042 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@71 -- # trap 'cleanup /tmp/vfio-user-* /var/tmp/suppress_vfio_fuzz; exit 1' SIGINT SIGTERM EXIT 00:08:55.042 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@74 -- # mem_size=0 00:08:55.042 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@75 -- # [[ 1 -eq 1 ]] 00:08:55.042 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@76 -- # start_llvm_fuzz_short 7 1 00:08:55.042 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@69 -- # local fuzz_num=7 00:08:55.042 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@70 -- # local time=1 00:08:55.042 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i = 0 )) 00:08:55.042 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:55.042 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:55.042 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:55.042 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:55.042 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:55.042 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:55.042 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:55.042 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:55.042 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:55.042 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:55.042 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:55.042 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:55.042 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:55.303 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:55.303 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:55.303 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:55.303 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:55.303 14:21:51 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:55.303 [2024-11-18 14:21:51.199099] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:08:55.303 [2024-11-18 14:21:51.199174] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid333817 ] 00:08:55.303 [2024-11-18 14:21:51.293194] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:55.303 [2024-11-18 14:21:51.317825] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:55.563 INFO: Running with entropic power schedule (0xFF, 100). 00:08:55.563 INFO: Seed: 2613136159 00:08:55.563 INFO: Loaded 1 modules (384731 inline 8-bit counters): 384731 [0x2a8c64c, 0x2aea527), 00:08:55.563 INFO: Loaded 1 PC tables (384731 PCs): 384731 [0x2aea528,0x30c92d8), 00:08:55.563 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:55.563 INFO: A corpus is not provided, starting from an empty corpus 00:08:55.563 #2 INITED exec/s: 0 rss: 66Mb 00:08:55.563 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:55.563 This may also happen if the target rejected all inputs we tried so far 00:08:55.563 [2024-11-18 14:21:51.553315] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: enabling controller 00:08:56.084 NEW_FUNC[1/672]: 0x459068 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:84 00:08:56.084 NEW_FUNC[2/672]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:56.084 #17 NEW cov: 11167 ft: 10765 corp: 2/7b lim: 6 exec/s: 0 rss: 72Mb L: 6/6 MS: 5 CrossOver-CopyPart-InsertRepeatedBytes-EraseBytes-CopyPart- 00:08:56.344 #29 NEW cov: 11184 ft: 14622 corp: 3/13b lim: 6 exec/s: 0 rss: 73Mb L: 6/6 MS: 2 EraseBytes-InsertByte- 00:08:56.344 NEW_FUNC[1/1]: 0x1c1efb8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:56.344 #30 NEW cov: 11201 ft: 15542 corp: 4/19b lim: 6 exec/s: 0 rss: 74Mb L: 6/6 MS: 1 CopyPart- 00:08:56.604 #31 NEW cov: 11201 ft: 17181 corp: 5/25b lim: 6 exec/s: 31 rss: 74Mb L: 6/6 MS: 1 CopyPart- 00:08:56.865 #37 NEW cov: 11201 ft: 17722 corp: 6/31b lim: 6 exec/s: 37 rss: 74Mb L: 6/6 MS: 1 ChangeBit- 00:08:57.125 #38 NEW cov: 11201 ft: 18190 corp: 7/37b lim: 6 exec/s: 38 rss: 75Mb L: 6/6 MS: 1 ChangeBit- 00:08:57.385 #39 NEW cov: 11201 ft: 18512 corp: 8/43b lim: 6 exec/s: 39 rss: 75Mb L: 6/6 MS: 1 CopyPart- 00:08:57.386 #40 NEW cov: 11208 ft: 18603 corp: 9/49b lim: 6 exec/s: 40 rss: 75Mb L: 6/6 MS: 1 ChangeBinInt- 00:08:57.647 #41 NEW cov: 11208 ft: 18748 corp: 10/55b lim: 6 exec/s: 20 rss: 75Mb L: 6/6 MS: 1 CrossOver- 00:08:57.647 #41 DONE cov: 11208 ft: 18748 corp: 10/55b lim: 6 exec/s: 20 rss: 75Mb 00:08:57.647 Done 41 runs in 2 second(s) 00:08:57.647 [2024-11-18 14:21:53.665736] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: disabling controller 00:08:57.908 14:21:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf 
/tmp/vfio-user-0 /var/tmp/suppress_vfio_fuzz 00:08:57.908 14:21:53 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:57.908 14:21:53 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:57.908 14:21:53 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:57.908 14:21:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=1 00:08:57.908 14:21:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:57.908 14:21:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:57.908 14:21:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:57.908 14:21:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:08:57.908 14:21:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:08:57.908 14:21:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:08:57.908 14:21:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:08:57.908 14:21:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:57.908 14:21:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:57.908 14:21:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:57.908 14:21:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:08:57.908 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:57.908 14:21:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:57.908 14:21:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:57.908 14:21:53 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:08:57.908 [2024-11-18 14:21:53.927661] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:08:57.908 [2024-11-18 14:21:53.927753] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid334351 ] 00:08:57.908 [2024-11-18 14:21:54.023501] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:58.169 [2024-11-18 14:21:54.045939] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:58.169 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:58.169 INFO: Seed: 1046160436 00:08:58.169 INFO: Loaded 1 modules (384731 inline 8-bit counters): 384731 [0x2a8c64c, 0x2aea527), 00:08:58.169 INFO: Loaded 1 PC tables (384731 PCs): 384731 [0x2aea528,0x30c92d8), 00:08:58.169 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:58.169 INFO: A corpus is not provided, starting from an empty corpus 00:08:58.169 #2 INITED exec/s: 0 rss: 67Mb 00:08:58.169 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:58.169 This may also happen if the target rejected all inputs we tried so far 00:08:58.169 [2024-11-18 14:21:54.291813] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: enabling controller 00:08:58.429 [2024-11-18 14:21:54.363868] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:58.429 [2024-11-18 14:21:54.363891] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:58.429 [2024-11-18 14:21:54.363929] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:58.690 NEW_FUNC[1/674]: 0x459608 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:71 00:08:58.690 NEW_FUNC[2/674]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:58.690 #25 NEW cov: 11162 ft: 11017 corp: 2/5b lim: 4 exec/s: 0 rss: 73Mb L: 4/4 MS: 3 CMP-CrossOver-InsertByte- DE: "\000\003"- 00:08:58.951 [2024-11-18 14:21:54.855100] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:58.951 [2024-11-18 14:21:54.855131] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:58.951 [2024-11-18 14:21:54.855149] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:58.951 #26 NEW cov: 11176 ft: 14751 corp: 3/9b lim: 4 exec/s: 0 rss: 74Mb L: 4/4 MS: 1 ChangeBinInt- 00:08:58.951 [2024-11-18 14:21:55.042126] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:58.951 [2024-11-18 14:21:55.042149] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:58.951 [2024-11-18 14:21:55.042167] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:59.211 NEW_FUNC[1/1]: 0x1c1efb8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:59.212 #27 NEW cov: 11193 ft: 15554 corp: 4/13b lim: 4 exec/s: 0 rss: 75Mb L: 4/4 MS: 1 ChangeBinInt- 00:08:59.212 [2024-11-18 14:21:55.234716] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:59.212 [2024-11-18 14:21:55.234739] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:59.212 [2024-11-18 14:21:55.234756] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:59.472 #28 NEW cov: 11193 ft: 16264 corp: 5/17b lim: 4 exec/s: 28 rss: 75Mb L: 4/4 MS: 1 ChangeBit- 00:08:59.472 [2024-11-18 14:21:55.424939] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:59.472 [2024-11-18 14:21:55.424961] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:59.472 [2024-11-18 14:21:55.424978] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 
return failure 00:08:59.472 #29 NEW cov: 11193 ft: 16813 corp: 6/21b lim: 4 exec/s: 29 rss: 75Mb L: 4/4 MS: 1 ChangeBit- 00:08:59.732 [2024-11-18 14:21:55.623888] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:59.732 [2024-11-18 14:21:55.623908] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:59.732 [2024-11-18 14:21:55.623925] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:59.732 #30 NEW cov: 11193 ft: 17624 corp: 7/25b lim: 4 exec/s: 30 rss: 75Mb L: 4/4 MS: 1 CopyPart- 00:08:59.732 [2024-11-18 14:21:55.833753] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:59.732 [2024-11-18 14:21:55.833775] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:59.732 [2024-11-18 14:21:55.833793] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:59.992 #31 NEW cov: 11193 ft: 18015 corp: 8/29b lim: 4 exec/s: 31 rss: 75Mb L: 4/4 MS: 1 PersAutoDict- DE: "\000\003"- 00:08:59.992 [2024-11-18 14:21:56.025396] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:59.992 [2024-11-18 14:21:56.025418] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:59.992 [2024-11-18 14:21:56.025436] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:00.253 #37 NEW cov: 11200 ft: 18296 corp: 9/33b lim: 4 exec/s: 37 rss: 76Mb L: 4/4 MS: 1 ChangeBinInt- 00:09:00.253 [2024-11-18 14:21:56.221616] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:00.253 [2024-11-18 14:21:56.221638] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:00.253 [2024-11-18 14:21:56.221655] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:00.253 #38 NEW cov: 11200 ft: 18438 corp: 10/37b lim: 4 exec/s: 19 rss: 76Mb L: 4/4 MS: 1 CrossOver- 00:09:00.253 #38 DONE cov: 11200 ft: 18438 corp: 10/37b lim: 4 exec/s: 19 rss: 76Mb 00:09:00.253 ###### Recommended dictionary. ###### 00:09:00.253 "\000\003" # Uses: 2 00:09:00.253 ###### End of recommended dictionary. 
###### 00:09:00.253 Done 38 runs in 2 second(s) 00:09:00.253 [2024-11-18 14:21:56.357723] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: disabling controller 00:09:00.514 14:21:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-1 /var/tmp/suppress_vfio_fuzz 00:09:00.514 14:21:56 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:00.514 14:21:56 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:00.514 14:21:56 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:09:00.514 14:21:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=2 00:09:00.514 14:21:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:00.514 14:21:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:00.514 14:21:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:09:00.514 14:21:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:09:00.514 14:21:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:09:00.514 14:21:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:09:00.514 14:21:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:09:00.514 14:21:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:00.514 14:21:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:00.514 14:21:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:09:00.514 14:21:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:09:00.514 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:00.514 14:21:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:00.514 14:21:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:00.514 14:21:56 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:09:00.514 [2024-11-18 14:21:56.617562] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 
00:09:00.514 [2024-11-18 14:21:56.617643] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid334674 ] 00:09:00.776 [2024-11-18 14:21:56.715434] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:00.776 [2024-11-18 14:21:56.737858] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:01.036 INFO: Running with entropic power schedule (0xFF, 100). 00:09:01.036 INFO: Seed: 3747159787 00:09:01.036 INFO: Loaded 1 modules (384731 inline 8-bit counters): 384731 [0x2a8c64c, 0x2aea527), 00:09:01.036 INFO: Loaded 1 PC tables (384731 PCs): 384731 [0x2aea528,0x30c92d8), 00:09:01.036 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:09:01.036 INFO: A corpus is not provided, starting from an empty corpus 00:09:01.036 #2 INITED exec/s: 0 rss: 66Mb 00:09:01.036 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:01.036 This may also happen if the target rejected all inputs we tried so far 00:09:01.036 [2024-11-18 14:21:56.992910] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: enabling controller 00:09:01.036 [2024-11-18 14:21:57.057606] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:01.557 NEW_FUNC[1/673]: 0x459ff8 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:103 00:09:01.557 NEW_FUNC[2/673]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:01.557 #22 NEW cov: 11149 ft: 11064 corp: 2/9b lim: 8 exec/s: 0 rss: 72Mb L: 8/8 MS: 5 CopyPart-InsertByte-ChangeByte-CrossOver-InsertRepeatedBytes- 00:09:01.557 [2024-11-18 14:21:57.540790] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:01.557 #23 NEW cov: 11163 ft: 14700 corp: 3/17b lim: 8 exec/s: 0 rss: 73Mb L: 8/8 MS: 1 ChangeByte- 00:09:01.817 [2024-11-18 14:21:57.749330] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:09:01.817 [2024-11-18 14:21:57.749365] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:09:01.817 NEW_FUNC[1/2]: 0x159ef18 in vfio_user_log /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:3098 00:09:01.817 NEW_FUNC[2/2]: 0x1c1efb8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:01.817 #29 NEW cov: 11190 ft: 15349 corp: 4/25b lim: 8 exec/s: 0 rss: 74Mb L: 8/8 MS: 1 CMP- DE: "\010\000\000\000\000\000\000\000"- 00:09:02.077 [2024-11-18 14:21:57.953297] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:02.077 #30 NEW cov: 11190 ft: 15738 corp: 5/33b lim: 8 exec/s: 30 rss: 74Mb L: 8/8 MS: 1 ChangeBit- 00:09:02.077 [2024-11-18 14:21:58.147669] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:02.338 #36 NEW cov: 11190 ft: 15894 corp: 6/41b lim: 8 exec/s: 36 rss: 74Mb L: 8/8 MS: 1 CrossOver- 00:09:02.338 [2024-11-18 14:21:58.338460] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:09:02.338 [2024-11-18 14:21:58.338490] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 
5 return failure 00:09:02.338 #37 NEW cov: 11190 ft: 16244 corp: 7/49b lim: 8 exec/s: 37 rss: 74Mb L: 8/8 MS: 1 CrossOver- 00:09:02.599 [2024-11-18 14:21:58.548806] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:02.599 #43 NEW cov: 11190 ft: 16570 corp: 8/57b lim: 8 exec/s: 43 rss: 74Mb L: 8/8 MS: 1 CMP- DE: "\354\321m\020\000 \000\000"- 00:09:02.859 [2024-11-18 14:21:58.736711] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:02.859 #44 NEW cov: 11197 ft: 17347 corp: 9/65b lim: 8 exec/s: 44 rss: 74Mb L: 8/8 MS: 1 CrossOver- 00:09:02.859 [2024-11-18 14:21:58.941243] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:03.120 #45 NEW cov: 11197 ft: 17422 corp: 10/73b lim: 8 exec/s: 22 rss: 75Mb L: 8/8 MS: 1 CrossOver- 00:09:03.120 #45 DONE cov: 11197 ft: 17422 corp: 10/73b lim: 8 exec/s: 22 rss: 75Mb 00:09:03.120 ###### Recommended dictionary. ###### 00:09:03.120 "\010\000\000\000\000\000\000\000" # Uses: 0 00:09:03.120 "\354\321m\020\000 \000\000" # Uses: 0 00:09:03.120 ###### End of recommended dictionary. ###### 00:09:03.120 Done 45 runs in 2 second(s) 00:09:03.120 [2024-11-18 14:21:59.074738] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: disabling controller 00:09:03.380 14:21:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-2 /var/tmp/suppress_vfio_fuzz 00:09:03.380 14:21:59 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:03.380 14:21:59 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:03.380 14:21:59 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:09:03.380 14:21:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=3 00:09:03.380 14:21:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:03.380 14:21:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:03.380 14:21:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:03.380 14:21:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:09:03.380 14:21:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:09:03.380 14:21:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:09:03.380 14:21:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:09:03.380 14:21:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:03.380 14:21:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:03.380 14:21:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:03.380 14:21:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:09:03.380 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:03.380 14:21:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:03.380 14:21:59 
llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:03.380 14:21:59 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:09:03.380 [2024-11-18 14:21:59.334892] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:09:03.380 [2024-11-18 14:21:59.334958] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid335179 ] 00:09:03.380 [2024-11-18 14:21:59.428012] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:03.380 [2024-11-18 14:21:59.450332] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:03.640 INFO: Running with entropic power schedule (0xFF, 100). 00:09:03.640 INFO: Seed: 2161200779 00:09:03.640 INFO: Loaded 1 modules (384731 inline 8-bit counters): 384731 [0x2a8c64c, 0x2aea527), 00:09:03.640 INFO: Loaded 1 PC tables (384731 PCs): 384731 [0x2aea528,0x30c92d8), 00:09:03.640 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:03.641 INFO: A corpus is not provided, starting from an empty corpus 00:09:03.641 #2 INITED exec/s: 0 rss: 68Mb 00:09:03.641 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:03.641 This may also happen if the target rejected all inputs we tried so far 00:09:03.641 [2024-11-18 14:21:59.691373] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: enabling controller 00:09:04.161 NEW_FUNC[1/673]: 0x45a6e8 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:124 00:09:04.161 NEW_FUNC[2/673]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:04.161 #111 NEW cov: 11157 ft: 11069 corp: 2/33b lim: 32 exec/s: 0 rss: 73Mb L: 32/32 MS: 4 CrossOver-ShuffleBytes-CrossOver-InsertRepeatedBytes- 00:09:04.422 #117 NEW cov: 11171 ft: 14005 corp: 3/65b lim: 32 exec/s: 0 rss: 74Mb L: 32/32 MS: 1 ShuffleBytes- 00:09:04.682 NEW_FUNC[1/1]: 0x1c1efb8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:04.682 #118 NEW cov: 11188 ft: 14776 corp: 4/97b lim: 32 exec/s: 0 rss: 75Mb L: 32/32 MS: 1 CopyPart- 00:09:04.682 #119 NEW cov: 11188 ft: 16158 corp: 5/129b lim: 32 exec/s: 119 rss: 75Mb L: 32/32 MS: 1 CrossOver- 00:09:04.943 #120 NEW cov: 11188 ft: 16250 corp: 6/161b lim: 32 exec/s: 120 rss: 75Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:05.204 #131 NEW cov: 11188 ft: 16619 corp: 7/193b lim: 32 exec/s: 131 rss: 75Mb L: 32/32 MS: 1 CrossOver- 00:09:05.463 #132 NEW cov: 11188 ft: 17228 corp: 8/225b lim: 32 exec/s: 132 rss: 75Mb L: 32/32 MS: 1 ChangeBit- 00:09:05.463 #137 NEW cov: 11195 ft: 17531 corp: 9/257b lim: 32 exec/s: 137 rss: 76Mb L: 32/32 MS: 5 EraseBytes-CopyPart-ShuffleBytes-InsertByte-InsertRepeatedBytes- 00:09:05.723 #138 NEW cov: 11195 ft: 17607 corp: 10/289b lim: 32 exec/s: 69 rss: 76Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:05.723 #138 DONE cov: 11195 ft: 17607 corp: 10/289b lim: 32 exec/s: 69 rss: 76Mb 00:09:05.723 Done 138 runs in 2 second(s) 00:09:05.723 [2024-11-18 14:22:01.738877] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: disabling controller 00:09:05.983 14:22:01 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-3 /var/tmp/suppress_vfio_fuzz 00:09:05.983 14:22:01 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:05.983 14:22:01 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:05.983 14:22:01 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:09:05.983 14:22:01 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=4 00:09:05.983 14:22:01 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:05.983 14:22:01 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:05.983 14:22:01 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:05.983 14:22:01 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:09:05.983 14:22:01 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:09:05.983 14:22:01 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:09:05.983 14:22:01 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:09:05.983 14:22:01 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:05.983 14:22:01 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local 
LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:05.983 14:22:01 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:05.983 14:22:01 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:09:05.983 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:05.983 14:22:01 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:05.983 14:22:01 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:05.983 14:22:01 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:09:05.983 [2024-11-18 14:22:01.998976] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:09:05.983 [2024-11-18 14:22:01.999050] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid335714 ] 00:09:05.983 [2024-11-18 14:22:02.091246] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:06.243 [2024-11-18 14:22:02.113742] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:06.243 INFO: Running with entropic power schedule (0xFF, 100). 00:09:06.243 INFO: Seed: 524227096 00:09:06.243 INFO: Loaded 1 modules (384731 inline 8-bit counters): 384731 [0x2a8c64c, 0x2aea527), 00:09:06.243 INFO: Loaded 1 PC tables (384731 PCs): 384731 [0x2aea528,0x30c92d8), 00:09:06.243 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:06.243 INFO: A corpus is not provided, starting from an empty corpus 00:09:06.243 #2 INITED exec/s: 0 rss: 67Mb 00:09:06.243 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:06.243 This may also happen if the target rejected all inputs we tried so far 00:09:06.243 [2024-11-18 14:22:02.350015] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: enabling controller 00:09:06.858 NEW_FUNC[1/672]: 0x45af68 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:144 00:09:06.858 NEW_FUNC[2/672]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:06.858 #22 NEW cov: 11144 ft: 10829 corp: 2/33b lim: 32 exec/s: 0 rss: 72Mb L: 32/32 MS: 5 InsertByte-EraseBytes-CrossOver-InsertRepeatedBytes-InsertRepeatedBytes- 00:09:06.858 NEW_FUNC[1/1]: 0x1c19658 in reactor_post_process_lw_thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:932 00:09:06.858 #28 NEW cov: 11173 ft: 14380 corp: 3/65b lim: 32 exec/s: 0 rss: 73Mb L: 32/32 MS: 1 ChangeByte- 00:09:07.124 #29 NEW cov: 11173 ft: 15435 corp: 4/97b lim: 32 exec/s: 0 rss: 74Mb L: 32/32 MS: 1 CopyPart- 00:09:07.124 NEW_FUNC[1/1]: 0x1c1efb8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:07.124 #30 NEW cov: 11190 ft: 15747 corp: 5/129b lim: 32 exec/s: 0 rss: 74Mb L: 32/32 MS: 1 ShuffleBytes- 00:09:07.124 #31 NEW cov: 11190 ft: 15996 corp: 6/161b lim: 32 exec/s: 0 rss: 74Mb L: 32/32 MS: 1 CopyPart- 00:09:07.408 #32 NEW cov: 11190 ft: 16649 corp: 7/193b lim: 32 exec/s: 32 rss: 74Mb L: 32/32 MS: 1 ChangeByte- 00:09:07.408 #33 NEW cov: 11190 ft: 16743 corp: 8/225b lim: 32 exec/s: 33 rss: 74Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:07.695 #34 NEW cov: 11190 ft: 16854 corp: 9/257b lim: 32 exec/s: 34 rss: 75Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:07.695 #35 NEW cov: 11190 ft: 16952 corp: 10/289b lim: 32 exec/s: 35 rss: 75Mb L: 32/32 MS: 1 ChangeByte- 00:09:07.969 #36 NEW cov: 11190 ft: 17437 corp: 11/321b lim: 32 exec/s: 36 rss: 75Mb L: 32/32 MS: 1 ChangeByte- 00:09:07.969 #42 NEW cov: 11190 ft: 17474 corp: 12/353b lim: 32 exec/s: 42 rss: 75Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:07.969 #53 NEW cov: 11190 ft: 17700 corp: 13/385b lim: 32 exec/s: 53 rss: 75Mb L: 32/32 MS: 1 CopyPart- 00:09:08.252 #54 NEW cov: 11197 ft: 17764 corp: 14/417b lim: 32 exec/s: 54 rss: 75Mb L: 32/32 MS: 1 ChangeBit- 00:09:08.252 #55 NEW cov: 11197 ft: 17818 corp: 15/449b lim: 32 exec/s: 27 rss: 75Mb L: 32/32 MS: 1 CrossOver- 00:09:08.252 #55 DONE cov: 11197 ft: 17818 corp: 15/449b lim: 32 exec/s: 27 rss: 75Mb 00:09:08.252 Done 55 runs in 2 second(s) 00:09:08.252 [2024-11-18 14:22:04.338768] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: disabling controller 00:09:08.566 14:22:04 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-4 /var/tmp/suppress_vfio_fuzz 00:09:08.566 14:22:04 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:08.566 14:22:04 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:08.566 14:22:04 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:09:08.566 14:22:04 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=5 00:09:08.566 14:22:04 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:08.566 14:22:04 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:08.566 14:22:04 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:08.566 14:22:04 llvm_fuzz.vfio_llvm_fuzz -- 
vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:09:08.566 14:22:04 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:09:08.566 14:22:04 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:09:08.566 14:22:04 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:09:08.566 14:22:04 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:08.566 14:22:04 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:08.566 14:22:04 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:08.566 14:22:04 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:09:08.566 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:08.566 14:22:04 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:08.566 14:22:04 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:08.566 14:22:04 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:09:08.566 [2024-11-18 14:22:04.598413] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization... 00:09:08.566 [2024-11-18 14:22:04.598487] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid336144 ] 00:09:08.836 [2024-11-18 14:22:04.691716] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:08.836 [2024-11-18 14:22:04.714916] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:08.836 INFO: Running with entropic power schedule (0xFF, 100). 00:09:08.836 INFO: Seed: 3129248091 00:09:08.836 INFO: Loaded 1 modules (384731 inline 8-bit counters): 384731 [0x2a8c64c, 0x2aea527), 00:09:08.836 INFO: Loaded 1 PC tables (384731 PCs): 384731 [0x2aea528,0x30c92d8), 00:09:08.836 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:08.836 INFO: A corpus is not provided, starting from an empty corpus 00:09:08.836 #2 INITED exec/s: 0 rss: 66Mb 00:09:08.836 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:08.836 This may also happen if the target rejected all inputs we tried so far
00:09:08.836 [2024-11-18 14:22:04.954258] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: enabling controller
00:09:09.117 [2024-11-18 14:22:05.026571] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:09.117 [2024-11-18 14:22:05.026609] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:09.402 NEW_FUNC[1/674]: 0x45b968 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:171
00:09:09.402 NEW_FUNC[2/674]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:09:09.402 #49 NEW cov: 11160 ft: 11069 corp: 2/14b lim: 13 exec/s: 0 rss: 72Mb L: 13/13 MS: 2 CrossOver-InsertRepeatedBytes-
00:09:09.402 [2024-11-18 14:22:05.522448] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:09.402 [2024-11-18 14:22:05.522488] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:09.696 #55 NEW cov: 11182 ft: 14140 corp: 3/27b lim: 13 exec/s: 0 rss: 73Mb L: 13/13 MS: 1 CMP- DE: "\000\000\000\000\000\000\000y"-
00:09:09.696 [2024-11-18 14:22:05.723196] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:09.696 [2024-11-18 14:22:05.723227] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:09.989 NEW_FUNC[1/1]: 0x1c1efb8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662
00:09:09.989 #56 NEW cov: 11199 ft: 15141 corp: 4/40b lim: 13 exec/s: 0 rss: 74Mb L: 13/13 MS: 1 ChangeBit-
00:09:09.989 [2024-11-18 14:22:05.924045] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:09.989 [2024-11-18 14:22:05.924077] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:09.989 #57 NEW cov: 11199 ft: 16259 corp: 5/53b lim: 13 exec/s: 57 rss: 74Mb L: 13/13 MS: 1 ChangeByte-
00:09:09.989 [2024-11-18 14:22:06.109090] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:09.989 [2024-11-18 14:22:06.109126] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:10.262 #58 NEW cov: 11199 ft: 16712 corp: 6/66b lim: 13 exec/s: 58 rss: 74Mb L: 13/13 MS: 1 ChangeBit-
00:09:10.262 [2024-11-18 14:22:06.296508] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:10.262 [2024-11-18 14:22:06.296539] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:10.544 #64 NEW cov: 11199 ft: 17003 corp: 7/79b lim: 13 exec/s: 64 rss: 74Mb L: 13/13 MS: 1 ChangeBit-
00:09:10.544 [2024-11-18 14:22:06.481533] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:10.544 [2024-11-18 14:22:06.481570] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:10.544 #65 NEW cov: 11199 ft: 17084 corp: 8/92b lim: 13 exec/s: 65 rss: 74Mb L: 13/13 MS: 1 ShuffleBytes-
00:09:10.834 [2024-11-18 14:22:06.672075] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:10.834 [2024-11-18 14:22:06.672106] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:10.834 #66 NEW cov: 11206 ft: 17197 corp: 9/105b lim: 13 exec/s: 66 rss: 74Mb L: 13/13 MS: 1 ShuffleBytes-
00:09:10.834 [2024-11-18 14:22:06.859097] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:10.834 [2024-11-18 14:22:06.859129] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:11.103 #67 NEW cov: 11206 ft: 17223 corp: 10/118b lim: 13 exec/s: 33 rss: 74Mb L: 13/13 MS: 1 ChangeByte-
00:09:11.103 #67 DONE cov: 11206 ft: 17223 corp: 10/118b lim: 13 exec/s: 33 rss: 74Mb
00:09:11.103 ###### Recommended dictionary. ######
00:09:11.103 "\000\000\000\000\000\000\000y" # Uses: 0
00:09:11.103 ###### End of recommended dictionary. ######
00:09:11.103 Done 67 runs in 2 second(s)
00:09:11.103 [2024-11-18 14:22:06.990727] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: disabling controller
00:09:11.103 14:22:07 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-5 /var/tmp/suppress_vfio_fuzz
14:22:07 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ ))
14:22:07 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num ))
14:22:07 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1
14:22:07 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=6
14:22:07 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1
14:22:07 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1
14:22:07 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6
14:22:07 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6
14:22:07 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1
14:22:07 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2
14:22:07 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf
14:22:07 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz
14:22:07 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0
14:22:07 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6
14:22:07 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%;
00:09:11.103 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
14:22:07 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect
14:22:07 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create
14:22:07 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6
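The xtrace lines above show the shape of each per-fuzzer iteration: make the per-fuzzer directories, rewrite the template JSON config so it points at this fuzzer's vfio-user sockets, write the LeakSanitizer suppressions, then launch the harness with a per-run timeout and corpus directory. A minimal shell sketch of that loop follows; FUZZ_HOME, fuzz_num, and the redirection targets are illustrative assumptions reconstructed from the trace, not the actual run.sh/common.sh contents (xtrace does not show redirections):

  FUZZ_HOME=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  fuzz_num=7   # assumed; the loop bound is not visible in this excerpt
  for ((i = 0; i < fuzz_num; i++)); do
    fuzzer_dir=/tmp/vfio-user-$i
    corpus_dir=$FUZZ_HOME/../corpus/llvm_vfio_$i
    suppress_file=/var/tmp/suppress_vfio_fuzz
    mkdir -p "$fuzzer_dir/domain/1" "$fuzzer_dir/domain/2" "$corpus_dir"
    # Point the shared template config at this fuzzer's vfio-user sockets.
    sed -e "s%/tmp/vfio-user/domain/1%$fuzzer_dir/domain/1%; s%/tmp/vfio-user/domain/2%$fuzzer_dir/domain/2%" \
        "$FUZZ_HOME/test/fuzz/llvm/vfio/fuzz_vfio_json.conf" > "$fuzzer_dir/fuzz_vfio_json.conf"
    # Known, intentional leaks are suppressed rather than failing the run.
    printf 'leak:spdk_nvmf_qpair_disconnect\nleak:nvmf_ctrlr_create\n' > "$suppress_file"
    LSAN_OPTIONS="report_objects=1:suppressions=$suppress_file:print_suppressions=0" \
      "$FUZZ_HOME/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz" \
        -m 0x1 -s 0 -P "$FUZZ_HOME/../output/llvm/" \
        -F "$fuzzer_dir/domain/1" -c "$fuzzer_dir/fuzz_vfio_json.conf" \
        -t 1 -D "$corpus_dir" -Y "$fuzzer_dir/domain/2" \
        -r "$fuzzer_dir/spdk$i.sock" -Z "$i"
    rm -rf "$fuzzer_dir" "$suppress_file"
  done

The "-t 1" per-run timeout is why each fuzzer below reports "Done N runs in 2 second(s)": every target gets a short, fixed budget rather than running to saturation.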
00:09:11.385 [2024-11-18 14:22:07.248147] Starting SPDK v25.01-pre git sha1 d47eb51c9 / DPDK 23.11.0 initialization...
00:09:11.385 [2024-11-18 14:22:07.248213] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid336558 ]
00:09:11.385 [2024-11-18 14:22:07.342520] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:11.385 [2024-11-18 14:22:07.364861] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:09:11.661 INFO: Running with entropic power schedule (0xFF, 100).
00:09:11.661 INFO: Seed: 1490269574
00:09:11.661 INFO: Loaded 1 modules (384731 inline 8-bit counters): 384731 [0x2a8c64c, 0x2aea527),
00:09:11.661 INFO: Loaded 1 PC tables (384731 PCs): 384731 [0x2aea528,0x30c92d8),
00:09:11.661 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6
00:09:11.661 INFO: A corpus is not provided, starting from an empty corpus
00:09:11.661 #2 INITED exec/s: 0 rss: 67Mb
00:09:11.661 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:09:11.661 This may also happen if the target rejected all inputs we tried so far
00:09:11.661 [2024-11-18 14:22:07.620708] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: enabling controller
00:09:11.661 [2024-11-18 14:22:07.682584] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:11.661 [2024-11-18 14:22:07.682616] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:12.221 NEW_FUNC[1/674]: 0x45c658 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190
00:09:12.221 NEW_FUNC[2/674]: 0x45eb78 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:09:12.221 #27 NEW cov: 11152 ft: 11067 corp: 2/10b lim: 9 exec/s: 0 rss: 73Mb L: 9/9 MS: 5 ShuffleBytes-InsertRepeatedBytes-CrossOver-InsertByte-CopyPart-
00:09:12.221 [2024-11-18 14:22:08.180596] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:12.221 [2024-11-18 14:22:08.180637] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:12.221 #29 NEW cov: 11174 ft: 14432 corp: 3/19b lim: 9 exec/s: 0 rss: 74Mb L: 9/9 MS: 2 ChangeBit-CrossOver-
00:09:12.480 [2024-11-18 14:22:08.387712] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:12.480 [2024-11-18 14:22:08.387743] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:12.481 NEW_FUNC[1/1]: 0x1c1efb8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662
00:09:12.481 #36 NEW cov: 11191 ft: 16064 corp: 4/28b lim: 9 exec/s: 0 rss: 75Mb L: 9/9 MS: 2 CrossOver-InsertRepeatedBytes-
00:09:12.481 [2024-11-18 14:22:08.596907] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:12.481 [2024-11-18 14:22:08.596937] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:12.740 #37 NEW cov: 11191 ft: 16286 corp: 5/37b lim: 9 exec/s: 37 rss: 75Mb L: 9/9 MS: 1 ChangeBit-
00:09:12.740 [2024-11-18 14:22:08.798185] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:12.740 [2024-11-18 14:22:08.798214] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:13.000 #43 NEW cov: 11191 ft: 17121 corp: 6/46b lim: 9 exec/s: 43 rss: 75Mb L: 9/9 MS: 1 ChangeByte-
00:09:13.000 [2024-11-18 14:22:09.012461] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:13.000 [2024-11-18 14:22:09.012491] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:13.259 #44 NEW cov: 11191 ft: 17576 corp: 7/55b lim: 9 exec/s: 44 rss: 75Mb L: 9/9 MS: 1 ChangeBinInt-
00:09:13.259 [2024-11-18 14:22:09.224528] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:13.259 [2024-11-18 14:22:09.224569] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:13.259 #47 NEW cov: 11191 ft: 17779 corp: 8/64b lim: 9 exec/s: 47 rss: 75Mb L: 9/9 MS: 3 ChangeBit-CMP-CopyPart- DE: "\001\000\000\000"-
00:09:13.518 [2024-11-18 14:22:09.439295] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:13.518 [2024-11-18 14:22:09.439327] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:13.518 #53 NEW cov: 11198 ft: 17814 corp: 9/73b lim: 9 exec/s: 53 rss: 75Mb L: 9/9 MS: 1 CopyPart-
00:09:13.518 [2024-11-18 14:22:09.635415] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:09:13.518 [2024-11-18 14:22:09.635443] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:09:13.857 #58 NEW cov: 11198 ft: 17913 corp: 10/82b lim: 9 exec/s: 29 rss: 75Mb L: 9/9 MS: 5 CopyPart-ChangeByte-InsertByte-CopyPart-InsertRepeatedBytes-
00:09:13.857 #58 DONE cov: 11198 ft: 17913 corp: 10/82b lim: 9 exec/s: 29 rss: 75Mb
00:09:13.857 ###### Recommended dictionary. ######
00:09:13.857 "\001\000\000\000" # Uses: 1
00:09:13.857 ###### End of recommended dictionary. ######
00:09:13.857 Done 58 runs in 2 second(s)
00:09:13.857 [2024-11-18 14:22:09.769742] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: disabling controller
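At the end of each run libFuzzer prints a "Recommended dictionary": the input fragments it found useful for reaching new coverage, with a use count per entry. These can be fed back into a later run via libFuzzer's stock -dict= option. A sketch follows, with two assumptions flagged: the octal escapes printed above are rewritten in the \xNN hex form that dictionary files use, and nothing in this log shows whether the llvm_vfio_fuzz wrapper forwards extra flags through to libFuzzer:

  # Hypothetical dictionary file built from the entry recommended above
  # ("\001\000\000\000" rewritten in \xNN form; entries may also be named).
  cat > /tmp/llvm_vfio.dict <<'EOF'
  "\x01\x00\x00\x00"
  EOF
  # -dict= is a standard libFuzzer option; passing it through the SPDK
  # harness is an assumption, not something this log demonstrates.
  ./llvm_vfio_fuzz -dict=/tmp/llvm_vfio.dict  # plus the -m/-F/-c/-t/-D/-Y/-r/-Z flags shown earlier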
00:09:14.117 14:22:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-6 /var/tmp/suppress_vfio_fuzz
14:22:09 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ ))
14:22:09 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num ))
14:22:09 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@84 -- # trap - SIGINT SIGTERM EXIT
00:09:14.117
00:09:14.117 real 0m19.306s
00:09:14.117 user 0m27.120s
00:09:14.117 sys 0m1.954s
14:22:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1130 -- # xtrace_disable
14:22:09 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@10 -- # set +x
00:09:14.117 ************************************
00:09:14.117 END TEST vfio_llvm_fuzz
00:09:14.117 ************************************
00:09:14.117
00:09:14.117 real 1m23.241s
00:09:14.117 user 2m6.460s
00:09:14.117 sys 0m10.303s
14:22:10 llvm_fuzz -- common/autotest_common.sh@1130 -- # xtrace_disable
14:22:10 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x
00:09:14.117 ************************************
00:09:14.117 END TEST llvm_fuzz
00:09:14.117 ************************************
00:09:14.117 14:22:10 -- spdk/autotest.sh@378 -- # [[ '' -eq 1 ]]
00:09:14.117 14:22:10 -- spdk/autotest.sh@385 -- # trap - SIGINT SIGTERM EXIT
00:09:14.117 14:22:10 -- spdk/autotest.sh@387 -- # timing_enter post_cleanup
00:09:14.117 14:22:10 -- common/autotest_common.sh@726 -- # xtrace_disable
00:09:14.117 14:22:10 -- common/autotest_common.sh@10 -- # set +x
00:09:14.117 14:22:10 -- spdk/autotest.sh@388 -- # autotest_cleanup
00:09:14.117 14:22:10 -- common/autotest_common.sh@1396 -- # local autotest_es=0
00:09:14.117 14:22:10 -- common/autotest_common.sh@1397 -- # xtrace_disable
00:09:14.117 14:22:10 -- common/autotest_common.sh@10 -- # set +x
00:09:20.700 INFO: APP EXITING
00:09:20.700 INFO: killing all VMs
00:09:20.700 INFO: killing vhost app
00:09:20.700 INFO: EXIT DONE
00:09:23.999 Waiting for block devices as requested
00:09:23.999 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma
00:09:23.999 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma
00:09:23.999 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma
00:09:24.259 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma
00:09:24.259 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma
00:09:24.259 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma
00:09:24.519 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma
00:09:24.519 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma
00:09:24.519 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma
00:09:24.780 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma
00:09:24.780 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma
00:09:24.780 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma
00:09:25.040 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma
00:09:25.040 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma
00:09:25.040 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma
00:09:25.300 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma
00:09:25.300 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme
00:09:29.505 Cleaning
00:09:29.505 Removing: /dev/shm/spdk_tgt_trace.pid309087
00:09:29.505 Removing: /var/run/dpdk/spdk_pid306623
00:09:29.505 Removing: /var/run/dpdk/spdk_pid307745
00:09:29.505 Removing: /var/run/dpdk/spdk_pid309087
00:09:29.505 Removing: /var/run/dpdk/spdk_pid309553
00:09:29.505 Removing: /var/run/dpdk/spdk_pid310609
00:09:29.505 Removing: /var/run/dpdk/spdk_pid310657
00:09:29.505 Removing: /var/run/dpdk/spdk_pid311757
00:09:29.505 Removing: /var/run/dpdk/spdk_pid311772
00:09:29.505 Removing: /var/run/dpdk/spdk_pid312209
00:09:29.505 Removing: /var/run/dpdk/spdk_pid312526
00:09:29.505 Removing: /var/run/dpdk/spdk_pid312857
00:09:29.505 Removing: /var/run/dpdk/spdk_pid313074
00:09:29.505 Removing: /var/run/dpdk/spdk_pid313276
00:09:29.505 Removing: /var/run/dpdk/spdk_pid313562
00:09:29.505 Removing: /var/run/dpdk/spdk_pid313842
00:09:29.505 Removing: /var/run/dpdk/spdk_pid314164
00:09:29.505 Removing: /var/run/dpdk/spdk_pid314752
00:09:29.505 Removing: /var/run/dpdk/spdk_pid317922
00:09:29.505 Removing: /var/run/dpdk/spdk_pid318193
00:09:29.505 Removing: /var/run/dpdk/spdk_pid318310
00:09:29.505 Removing: /var/run/dpdk/spdk_pid318477
00:09:29.505 Removing: /var/run/dpdk/spdk_pid318813
00:09:29.505 Removing: /var/run/dpdk/spdk_pid318817
00:09:29.505 Removing: /var/run/dpdk/spdk_pid319384
00:09:29.505 Removing: /var/run/dpdk/spdk_pid319432
00:09:29.505 Removing: /var/run/dpdk/spdk_pid319824
00:09:29.505 Removing: /var/run/dpdk/spdk_pid319949
00:09:29.505 Removing: /var/run/dpdk/spdk_pid320081
00:09:29.505 Removing: /var/run/dpdk/spdk_pid320244
00:09:29.505 Removing: /var/run/dpdk/spdk_pid320640
00:09:29.505 Removing: /var/run/dpdk/spdk_pid320922
00:09:29.505 Removing: /var/run/dpdk/spdk_pid321202
00:09:29.505 Removing: /var/run/dpdk/spdk_pid321379
00:09:29.505 Removing: /var/run/dpdk/spdk_pid322036
00:09:29.505 Removing: /var/run/dpdk/spdk_pid322486
00:09:29.505 Removing: /var/run/dpdk/spdk_pid322857
00:09:29.505 Removing: /var/run/dpdk/spdk_pid323386
00:09:29.505 Removing: /var/run/dpdk/spdk_pid323690
00:09:29.505 Removing: /var/run/dpdk/spdk_pid324208
00:09:29.505 Removing: /var/run/dpdk/spdk_pid324800
00:09:29.505 Removing: /var/run/dpdk/spdk_pid325460
00:09:29.505 Removing: /var/run/dpdk/spdk_pid326121
00:09:29.505 Removing: /var/run/dpdk/spdk_pid326444
00:09:29.505 Removing: /var/run/dpdk/spdk_pid326932
00:09:29.505 Removing: /var/run/dpdk/spdk_pid327447
00:09:29.505 Removing: /var/run/dpdk/spdk_pid327756
00:09:29.505 Removing: /var/run/dpdk/spdk_pid328291
00:09:29.505 Removing: /var/run/dpdk/spdk_pid328712
00:09:29.505 Removing: /var/run/dpdk/spdk_pid329126
00:09:29.505 Removing: /var/run/dpdk/spdk_pid329659
00:09:29.505 Removing: /var/run/dpdk/spdk_pid329956
00:09:29.505 Removing: /var/run/dpdk/spdk_pid330481
00:09:29.505 Removing: /var/run/dpdk/spdk_pid331010
00:09:29.505 Removing: /var/run/dpdk/spdk_pid331316
00:09:29.505 Removing: /var/run/dpdk/spdk_pid331837
00:09:29.505 Removing: /var/run/dpdk/spdk_pid332323
00:09:29.505 Removing: /var/run/dpdk/spdk_pid332657
00:09:29.505 Removing: /var/run/dpdk/spdk_pid333196
00:09:29.505 Removing: /var/run/dpdk/spdk_pid333817
00:09:29.505 Removing: /var/run/dpdk/spdk_pid334351
00:09:29.505 Removing: /var/run/dpdk/spdk_pid334674
00:09:29.505 Removing: /var/run/dpdk/spdk_pid335179
00:09:29.505 Removing: /var/run/dpdk/spdk_pid335714
00:09:29.505 Removing: /var/run/dpdk/spdk_pid336144
00:09:29.505 Removing: /var/run/dpdk/spdk_pid336558
00:09:29.505 Clean
00:09:29.505 14:22:25 -- common/autotest_common.sh@1453 -- # return 0
00:09:29.505 14:22:25 -- spdk/autotest.sh@389 -- # timing_exit post_cleanup
00:09:29.505 14:22:25 -- common/autotest_common.sh@732 -- # xtrace_disable
00:09:29.505 14:22:25 -- common/autotest_common.sh@10 -- # set +x
00:09:29.505 14:22:25 -- spdk/autotest.sh@391 -- # timing_exit autotest
00:09:29.505 14:22:25 -- common/autotest_common.sh@732 -- # xtrace_disable
00:09:29.505 14:22:25 -- common/autotest_common.sh@10 -- # set +x
00:09:29.505 14:22:25 -- spdk/autotest.sh@392 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:09:29.505 14:22:25 -- spdk/autotest.sh@394 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]]
00:09:29.505 14:22:25 -- spdk/autotest.sh@394 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log
00:09:29.505 14:22:25 -- spdk/autotest.sh@396 -- # [[ y == y ]]
00:09:29.505 14:22:25 -- spdk/autotest.sh@398 -- # hostname
00:09:29.506 14:22:25 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -t spdk-wfp-20 -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info
00:09:29.506 geninfo: WARNING: invalid characters removed from testname!
00:09:32.806 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/mdns_server.gcda
00:09:38.094 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_stubs.gcda
00:09:42.300 14:22:37 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:09:50.430 14:22:45 -- spdk/autotest.sh@400 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:09:54.630 14:22:50 -- spdk/autotest.sh@404 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info --ignore-errors unused,unused '/usr/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:09:59.913 14:22:55 -- spdk/autotest.sh@405 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:10:05.201 14:23:00 -- spdk/autotest.sh@406 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:10:10.482 14:23:06 -- spdk/autotest.sh@407 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:10:15.770 14:23:11 -- spdk/autotest.sh@408 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR
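Condensed, the coverage post-processing above is a standard lcov capture/merge/filter pipeline. A compact restatement follows, with the repeated --rc options elided and paths shortened to variables; cov_base.info is the pre-test baseline, presumably captured earlier in the job outside this excerpt, and llvm-gcov.sh is presumably a thin wrapper around llvm-cov gcov so lcov can read clang-instrumented .gcda files:

  spdk=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  out=$spdk/../output
  gcov_tool=$spdk/test/fuzz/llvm/llvm-gcov.sh
  # 1. Capture post-test counters into cov_test.info.
  lcov -q -c --no-external -d "$spdk" -t spdk-wfp-20 --gcov-tool "$gcov_tool" -o "$out/cov_test.info"
  # 2. Merge the pre-test baseline with the post-test capture.
  lcov -q -a "$out/cov_base.info" -a "$out/cov_test.info" --gcov-tool "$gcov_tool" -o "$out/cov_total.info"
  # 3. Strip code that isn't SPDK's own: bundled DPDK, system headers,
  #    and sample apps. The job runs one pattern per invocation; lcov
  #    also accepts several patterns in a single --remove call.
  lcov -q -r "$out/cov_total.info" '*/dpdk/*' --gcov-tool "$gcov_tool" -o "$out/cov_total.info"
  lcov -q -r "$out/cov_total.info" --ignore-errors unused,unused '/usr/*' --gcov-tool "$gcov_tool" -o "$out/cov_total.info"
  lcov -q -r "$out/cov_total.info" '*/examples/vmd/*' '*/app/spdk_lspci/*' '*/app/spdk_top/*' --gcov-tool "$gcov_tool" -o "$out/cov_total.info"
  # 4. Drop the intermediates, keeping only the filtered total.
  rm -f "$out/cov_base.info" "$out/cov_test.info"

The geninfo warnings above are benign in this context: mdns_server.gcda and nvme_stubs.gcda simply had no counters because those objects were never executed by the fuzz targets.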
00:10:15.770 14:23:11 -- spdk/autorun.sh@1 -- $ timing_finish
00:10:15.770 14:23:11 -- common/autotest_common.sh@738 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt ]]
00:10:15.770 14:23:11 -- common/autotest_common.sh@740 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:10:15.770 14:23:11 -- common/autotest_common.sh@741 -- $ [[ -x /usr/local/FlameGraph/flamegraph.pl ]]
00:10:15.770 14:23:11 -- common/autotest_common.sh@744 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:10:15.770 + [[ -n 179992 ]]
00:10:15.770 + sudo kill 179992
00:10:15.782 [Pipeline] }
00:10:15.799 [Pipeline] // stage
00:10:15.804 [Pipeline] }
00:10:15.821 [Pipeline] // timeout
00:10:15.826 [Pipeline] }
00:10:15.842 [Pipeline] // catchError
00:10:15.847 [Pipeline] }
00:10:15.864 [Pipeline] // wrap
00:10:15.870 [Pipeline] }
00:10:15.884 [Pipeline] // catchError
00:10:15.894 [Pipeline] stage
00:10:15.897 [Pipeline] { (Epilogue)
00:10:15.911 [Pipeline] catchError
00:10:15.913 [Pipeline] {
00:10:15.927 [Pipeline] echo
00:10:15.929 Cleanup processes
00:10:15.936 [Pipeline] sh
00:10:16.229 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:10:16.229 345241 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:10:16.244 [Pipeline] sh
00:10:16.535 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:10:16.535 ++ grep -v 'sudo pgrep'
00:10:16.535 ++ awk '{print $1}'
00:10:16.535 + sudo kill -9
00:10:16.535 + true
00:10:16.548 [Pipeline] sh
00:10:16.838 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:10:16.838 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:10:16.838 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:10:18.221 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:10:30.457 [Pipeline] sh
00:10:30.748 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:10:30.749 Artifacts sizes are good
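One detail worth unpacking from the Cleanup processes step above: it is the usual kill-by-pattern idiom, split across xtrace'd stages. Linearized, it reads roughly as follows; the `|| true` mirrors the `+ true` in the trace, since `kill -9` with an empty PID list exits non-zero and that must not fail the stage:

  # pgrep -af lists "PID full-command"; the grep drops the pgrep invocation
  # itself (its own command line matches the pattern), awk keeps the PID column.
  pids=$(sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk \
          | grep -v 'sudo pgrep' | awk '{print $1}')
  # $pids is intentionally unquoted so each PID becomes a separate argument.
  sudo kill -9 $pids || true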
00:10:30.764 [Pipeline] archiveArtifacts
00:10:30.772 Archiving artifacts
00:10:30.910 [Pipeline] sh
00:10:31.200 + sudo chown -R sys_sgci: /var/jenkins/workspace/short-fuzz-phy-autotest
00:10:31.216 [Pipeline] cleanWs
00:10:31.226 [WS-CLEANUP] Deleting project workspace...
00:10:31.226 [WS-CLEANUP] Deferred wipeout is used...
00:10:31.234 [WS-CLEANUP] done
00:10:31.236 [Pipeline] }
00:10:31.254 [Pipeline] // catchError
00:10:31.267 [Pipeline] sh
00:10:31.555 + logger -p user.info -t JENKINS-CI
00:10:31.566 [Pipeline] }
00:10:31.579 [Pipeline] // stage
00:10:31.585 [Pipeline] }
00:10:31.600 [Pipeline] // node
00:10:31.606 [Pipeline] End of Pipeline
00:10:31.652 Finished: SUCCESS