00:00:00.001 Started by upstream project "autotest-spdk-v24.01-LTS-vs-dpdk-v23.11" build number 598 00:00:00.001 originally caused by: 00:00:00.002 Started by upstream project "nightly-trigger" build number 3264 00:00:00.002 originally caused by: 00:00:00.002 Started by timer 00:00:00.002 Started by timer 00:00:00.021 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.022 The recommended git tool is: git 00:00:00.022 using credential 00000000-0000-0000-0000-000000000002 00:00:00.024 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.043 Fetching changes from the remote Git repository 00:00:00.046 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.064 Using shallow fetch with depth 1 00:00:00.064 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.064 > git --version # timeout=10 00:00:00.085 > git --version # 'git version 2.39.2' 00:00:00.086 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.118 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.118 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:02.837 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:02.847 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:02.859 Checking out Revision 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d (FETCH_HEAD) 00:00:02.859 > git config core.sparsecheckout # timeout=10 00:00:02.868 > git read-tree -mu HEAD # timeout=10 00:00:02.884 > git checkout -f 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d # timeout=5 00:00:02.899 Commit message: "inventory: add WCP3 to free inventory" 00:00:02.899 > git rev-list --no-walk 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d # timeout=10 00:00:03.026 [Pipeline] Start of Pipeline 00:00:03.041 [Pipeline] library 00:00:03.042 Loading library shm_lib@master 00:00:03.043 Library shm_lib@master is cached. Copying from home. 00:00:03.060 [Pipeline] node 00:00:03.074 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:03.075 [Pipeline] { 00:00:03.082 [Pipeline] catchError 00:00:03.083 [Pipeline] { 00:00:03.093 [Pipeline] wrap 00:00:03.101 [Pipeline] { 00:00:03.106 [Pipeline] stage 00:00:03.108 [Pipeline] { (Prologue) 00:00:03.307 [Pipeline] sh 00:00:03.587 + logger -p user.info -t JENKINS-CI 00:00:03.604 [Pipeline] echo 00:00:03.606 Node: WFP20 00:00:03.611 [Pipeline] sh 00:00:03.902 [Pipeline] setCustomBuildProperty 00:00:03.913 [Pipeline] echo 00:00:03.914 Cleanup processes 00:00:03.918 [Pipeline] sh 00:00:04.195 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.195 1862517 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.209 [Pipeline] sh 00:00:04.492 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.492 ++ grep -v 'sudo pgrep' 00:00:04.492 ++ awk '{print $1}' 00:00:04.492 + sudo kill -9 00:00:04.492 + true 00:00:04.508 [Pipeline] cleanWs 00:00:04.518 [WS-CLEANUP] Deleting project workspace... 00:00:04.518 [WS-CLEANUP] Deferred wipeout is used... 
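The "Cleanup processes" step traced above is the usual pre-run sweep for stale test processes: pgrep lists anything still running out of the workspace, grep -v drops pgrep's own entry, awk keeps just the PIDs, and the trailing "+ true" keeps an empty kill from failing the stage. A minimal standalone sketch of the same idiom, assuming the workspace path shown in the trace (the job itself inlines these commands rather than calling a helper):

    #!/usr/bin/env bash
    # Sweep stale SPDK test processes out of the workspace before a run.
    ws=/var/jenkins/workspace/short-fuzz-phy-autotest        # path from the trace
    pids=$(sudo pgrep -af "$ws/spdk" | grep -v 'sudo pgrep' | awk '{print $1}')
    # kill -9 with an empty PID list exits non-zero; "|| true" mirrors the
    # "+ true" in the trace and keeps cleanup from aborting the build.
    [ -n "$pids" ] && sudo kill -9 $pids || true
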
00:00:04.525 [WS-CLEANUP] done 00:00:04.531 [Pipeline] setCustomBuildProperty 00:00:04.548 [Pipeline] sh 00:00:04.834 + sudo git config --global --replace-all safe.directory '*' 00:00:04.920 [Pipeline] httpRequest 00:00:04.937 [Pipeline] echo 00:00:04.939 Sorcerer 10.211.164.101 is alive 00:00:04.946 [Pipeline] httpRequest 00:00:04.951 HttpMethod: GET 00:00:04.951 URL: http://10.211.164.101/packages/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz 00:00:04.952 Sending request to url: http://10.211.164.101/packages/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz 00:00:04.962 Response Code: HTTP/1.1 200 OK 00:00:04.963 Success: Status code 200 is in the accepted range: 200,404 00:00:04.963 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz 00:00:07.859 [Pipeline] sh 00:00:08.143 + tar --no-same-owner -xf jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz 00:00:08.161 [Pipeline] httpRequest 00:00:08.187 [Pipeline] echo 00:00:08.188 Sorcerer 10.211.164.101 is alive 00:00:08.198 [Pipeline] httpRequest 00:00:08.203 HttpMethod: GET 00:00:08.203 URL: http://10.211.164.101/packages/spdk_4b94202c659be49093c32ec1d2d75efdacf00691.tar.gz 00:00:08.204 Sending request to url: http://10.211.164.101/packages/spdk_4b94202c659be49093c32ec1d2d75efdacf00691.tar.gz 00:00:08.228 Response Code: HTTP/1.1 200 OK 00:00:08.228 Success: Status code 200 is in the accepted range: 200,404 00:00:08.229 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_4b94202c659be49093c32ec1d2d75efdacf00691.tar.gz 00:01:32.483 [Pipeline] sh 00:01:32.774 + tar --no-same-owner -xf spdk_4b94202c659be49093c32ec1d2d75efdacf00691.tar.gz 00:01:35.325 [Pipeline] sh 00:01:35.609 + git -C spdk log --oneline -n5 00:01:35.609 4b94202c6 lib/event: Bug fix for framework_set_scheduler 00:01:35.609 507e9ba07 nvme: add lock_depth for ctrlr_lock 00:01:35.609 62fda7b5f nvme: check pthread_mutex_destroy() return value 00:01:35.609 e03c164a1 nvme: add nvme_ctrlr_lock 00:01:35.609 d61f89a86 nvme/cuse: Add ctrlr_lock for cuse register and unregister 00:01:35.628 [Pipeline] withCredentials 00:01:35.638 > git --version # timeout=10 00:01:35.649 > git --version # 'git version 2.39.2' 00:01:35.665 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS 00:01:35.667 [Pipeline] { 00:01:35.676 [Pipeline] retry 00:01:35.678 [Pipeline] { 00:01:35.694 [Pipeline] sh 00:01:35.978 + git ls-remote http://dpdk.org/git/dpdk-stable v23.11 00:01:36.924 [Pipeline] } 00:01:36.947 [Pipeline] // retry 00:01:36.952 [Pipeline] } 00:01:36.973 [Pipeline] // withCredentials 00:01:36.982 [Pipeline] httpRequest 00:01:36.999 [Pipeline] echo 00:01:37.000 Sorcerer 10.211.164.101 is alive 00:01:37.008 [Pipeline] httpRequest 00:01:37.061 HttpMethod: GET 00:01:37.062 URL: http://10.211.164.101/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:01:37.062 Sending request to url: http://10.211.164.101/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:01:37.063 Response Code: HTTP/1.1 200 OK 00:01:37.063 Success: Status code 200 is in the accepted range: 200,404 00:01:37.064 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:01:41.843 [Pipeline] sh 00:01:42.128 + tar --no-same-owner -xf dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:01:43.523 [Pipeline] sh 00:01:43.807 + git -C dpdk log --oneline -n5 00:01:43.807 eeb0605f11 version: 23.11.0 00:01:43.807 238778122a doc: 
update release notes for 23.11 00:01:43.807 46aa6b3cfc doc: fix description of RSS features 00:01:43.807 dd88f51a57 devtools: forbid DPDK API in cnxk base driver 00:01:43.807 7e421ae345 devtools: support skipping forbid rule check 00:01:43.819 [Pipeline] } 00:01:43.837 [Pipeline] // stage 00:01:43.846 [Pipeline] stage 00:01:43.849 [Pipeline] { (Prepare) 00:01:43.872 [Pipeline] writeFile 00:01:43.890 [Pipeline] sh 00:01:44.174 + logger -p user.info -t JENKINS-CI 00:01:44.187 [Pipeline] sh 00:01:44.472 + logger -p user.info -t JENKINS-CI 00:01:44.486 [Pipeline] sh 00:01:44.770 + cat autorun-spdk.conf 00:01:44.770 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:44.770 SPDK_RUN_UBSAN=1 00:01:44.770 SPDK_TEST_FUZZER=1 00:01:44.770 SPDK_TEST_FUZZER_SHORT=1 00:01:44.770 SPDK_TEST_NATIVE_DPDK=v23.11 00:01:44.770 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:44.777 RUN_NIGHTLY=1 00:01:44.783 [Pipeline] readFile 00:01:44.812 [Pipeline] withEnv 00:01:44.815 [Pipeline] { 00:01:44.830 [Pipeline] sh 00:01:45.186 + set -ex 00:01:45.186 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]] 00:01:45.186 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:45.186 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:45.186 ++ SPDK_RUN_UBSAN=1 00:01:45.186 ++ SPDK_TEST_FUZZER=1 00:01:45.186 ++ SPDK_TEST_FUZZER_SHORT=1 00:01:45.186 ++ SPDK_TEST_NATIVE_DPDK=v23.11 00:01:45.186 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:45.186 ++ RUN_NIGHTLY=1 00:01:45.186 + case $SPDK_TEST_NVMF_NICS in 00:01:45.186 + DRIVERS= 00:01:45.186 + [[ -n '' ]] 00:01:45.186 + exit 0 00:01:45.196 [Pipeline] } 00:01:45.216 [Pipeline] // withEnv 00:01:45.222 [Pipeline] } 00:01:45.235 [Pipeline] // stage 00:01:45.244 [Pipeline] catchError 00:01:45.246 [Pipeline] { 00:01:45.263 [Pipeline] timeout 00:01:45.263 Timeout set to expire in 30 min 00:01:45.265 [Pipeline] { 00:01:45.280 [Pipeline] stage 00:01:45.282 [Pipeline] { (Tests) 00:01:45.300 [Pipeline] sh 00:01:45.583 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:45.583 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:45.583 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest 00:01:45.583 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]] 00:01:45.583 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:45.583 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output 00:01:45.583 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]] 00:01:45.583 + [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:01:45.583 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output 00:01:45.583 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:01:45.583 + [[ short-fuzz-phy-autotest == pkgdep-* ]] 00:01:45.583 + cd /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:45.583 + source /etc/os-release 00:01:45.583 ++ NAME='Fedora Linux' 00:01:45.583 ++ VERSION='38 (Cloud Edition)' 00:01:45.583 ++ ID=fedora 00:01:45.583 ++ VERSION_ID=38 00:01:45.583 ++ VERSION_CODENAME= 00:01:45.583 ++ PLATFORM_ID=platform:f38 00:01:45.583 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)' 00:01:45.583 ++ ANSI_COLOR='0;38;2;60;110;180' 00:01:45.583 ++ LOGO=fedora-logo-icon 00:01:45.583 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38 00:01:45.583 ++ HOME_URL=https://fedoraproject.org/ 00:01:45.583 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/ 00:01:45.583 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:01:45.583 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:01:45.583 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:01:45.583 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38 00:01:45.583 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:01:45.583 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38 00:01:45.583 ++ SUPPORT_END=2024-05-14 00:01:45.583 ++ VARIANT='Cloud Edition' 00:01:45.583 ++ VARIANT_ID=cloud 00:01:45.583 + uname -a 00:01:45.583 Linux spdk-wfp-20 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux 00:01:45.583 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:01:48.870 Hugepages 00:01:48.870 node hugesize free / total 00:01:48.870 node0 1048576kB 0 / 0 00:01:48.870 node0 2048kB 0 / 0 00:01:48.870 node1 1048576kB 0 / 0 00:01:48.870 node1 2048kB 0 / 0 00:01:48.870 00:01:48.870 Type BDF Vendor Device NUMA Driver Device Block devices 00:01:48.870 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:01:48.870 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:01:48.870 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:01:48.870 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:01:48.870 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:01:48.870 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:01:48.870 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:01:48.870 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:01:48.870 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:01:48.870 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:01:48.870 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:01:48.870 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:01:48.870 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:01:48.870 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:01:48.870 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:01:48.870 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:01:48.870 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:01:48.870 + rm -f /tmp/spdk-ld-path 00:01:48.870 + source autorun-spdk.conf 00:01:48.870 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:48.870 ++ SPDK_RUN_UBSAN=1 00:01:48.870 ++ SPDK_TEST_FUZZER=1 00:01:48.870 ++ SPDK_TEST_FUZZER_SHORT=1 00:01:48.870 ++ SPDK_TEST_NATIVE_DPDK=v23.11 00:01:48.870 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:48.870 ++ RUN_NIGHTLY=1 00:01:48.870 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:01:48.870 + [[ -n '' ]] 00:01:48.870 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:48.870 + for M in /var/spdk/build-*-manifest.txt 00:01:48.870 + [[ -f 
/var/spdk/build-pkg-manifest.txt ]] 00:01:48.870 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:48.870 + for M in /var/spdk/build-*-manifest.txt 00:01:48.870 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:01:48.870 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:48.870 ++ uname 00:01:48.870 + [[ Linux == \L\i\n\u\x ]] 00:01:48.870 + sudo dmesg -T 00:01:48.870 + sudo dmesg --clear 00:01:48.870 + dmesg_pid=1863883 00:01:48.870 + [[ Fedora Linux == FreeBSD ]] 00:01:48.870 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:48.870 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:48.870 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:01:48.870 + [[ -x /usr/src/fio-static/fio ]] 00:01:48.870 + export FIO_BIN=/usr/src/fio-static/fio 00:01:48.870 + FIO_BIN=/usr/src/fio-static/fio 00:01:48.870 + sudo dmesg -Tw 00:01:48.870 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:01:48.870 + [[ ! -v VFIO_QEMU_BIN ]] 00:01:48.870 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:01:48.870 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:48.870 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:48.870 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:01:48.870 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:48.870 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:48.870 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:48.870 Test configuration: 00:01:48.870 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:48.870 SPDK_RUN_UBSAN=1 00:01:48.870 SPDK_TEST_FUZZER=1 00:01:48.870 SPDK_TEST_FUZZER_SHORT=1 00:01:48.870 SPDK_TEST_NATIVE_DPDK=v23.11 00:01:48.870 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:48.870 RUN_NIGHTLY=1 10:33:05 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:01:48.870 10:33:05 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:01:48.870 10:33:05 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:01:48.870 10:33:05 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:01:48.870 10:33:05 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:48.870 10:33:05 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:48.870 10:33:05 -- paths/export.sh@4 -- $ 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:48.870 10:33:05 -- paths/export.sh@5 -- $ export PATH 00:01:48.870 10:33:05 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:48.870 10:33:05 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:01:48.870 10:33:05 -- common/autobuild_common.sh@435 -- $ date +%s 00:01:48.870 10:33:05 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1720859585.XXXXXX 00:01:48.870 10:33:05 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1720859585.g6tlw3 00:01:48.870 10:33:05 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]] 00:01:48.870 10:33:05 -- common/autobuild_common.sh@441 -- $ '[' -n v23.11 ']' 00:01:48.870 10:33:05 -- common/autobuild_common.sh@442 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:48.870 10:33:05 -- common/autobuild_common.sh@442 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:01:48.870 10:33:05 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:01:48.870 10:33:05 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:01:48.870 10:33:05 -- common/autobuild_common.sh@451 -- $ get_config_params 00:01:48.870 10:33:05 -- common/autotest_common.sh@387 -- $ xtrace_disable 00:01:48.870 10:33:05 -- common/autotest_common.sh@10 -- $ set +x 00:01:48.871 10:33:05 -- common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user' 00:01:48.871 10:33:05 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:01:48.871 10:33:05 -- spdk/autobuild.sh@12 -- $ umask 022 00:01:48.871 10:33:05 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:48.871 10:33:05 -- spdk/autobuild.sh@16 -- $ date -u 00:01:48.871 Sat Jul 13 08:33:05 AM UTC 2024 00:01:48.871 10:33:05 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:01:48.871 LTS-59-g4b94202c6 00:01:48.871 10:33:05 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:01:48.871 10:33:05 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:01:48.871 10:33:05 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:01:48.871 10:33:05 -- common/autotest_common.sh@1077 -- $ '[' 3 -le 1 ']' 00:01:48.871 10:33:05 -- common/autotest_common.sh@1083 -- $ 
xtrace_disable 00:01:48.871 10:33:05 -- common/autotest_common.sh@10 -- $ set +x 00:01:48.871 ************************************ 00:01:48.871 START TEST ubsan 00:01:48.871 ************************************ 00:01:48.871 10:33:05 -- common/autotest_common.sh@1104 -- $ echo 'using ubsan' 00:01:48.871 using ubsan 00:01:48.871 00:01:48.871 real 0m0.000s 00:01:48.871 user 0m0.000s 00:01:48.871 sys 0m0.000s 00:01:48.871 10:33:05 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:01:48.871 10:33:05 -- common/autotest_common.sh@10 -- $ set +x 00:01:48.871 ************************************ 00:01:48.871 END TEST ubsan 00:01:48.871 ************************************ 00:01:48.871 10:33:05 -- spdk/autobuild.sh@27 -- $ '[' -n v23.11 ']' 00:01:48.871 10:33:05 -- spdk/autobuild.sh@28 -- $ build_native_dpdk 00:01:48.871 10:33:05 -- common/autobuild_common.sh@427 -- $ run_test build_native_dpdk _build_native_dpdk 00:01:48.871 10:33:05 -- common/autotest_common.sh@1077 -- $ '[' 2 -le 1 ']' 00:01:48.871 10:33:05 -- common/autotest_common.sh@1083 -- $ xtrace_disable 00:01:48.871 10:33:05 -- common/autotest_common.sh@10 -- $ set +x 00:01:48.871 ************************************ 00:01:48.871 START TEST build_native_dpdk 00:01:48.871 ************************************ 00:01:48.871 10:33:05 -- common/autotest_common.sh@1104 -- $ _build_native_dpdk 00:01:48.871 10:33:05 -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir 00:01:48.871 10:33:05 -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir 00:01:48.871 10:33:05 -- common/autobuild_common.sh@50 -- $ local compiler_version 00:01:48.871 10:33:05 -- common/autobuild_common.sh@51 -- $ local compiler 00:01:48.871 10:33:05 -- common/autobuild_common.sh@52 -- $ local dpdk_kmods 00:01:48.871 10:33:05 -- common/autobuild_common.sh@53 -- $ local repo=dpdk 00:01:48.871 10:33:05 -- common/autobuild_common.sh@55 -- $ compiler=gcc 00:01:48.871 10:33:05 -- common/autobuild_common.sh@61 -- $ export CC=gcc 00:01:48.871 10:33:05 -- common/autobuild_common.sh@61 -- $ CC=gcc 00:01:48.871 10:33:05 -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]] 00:01:48.871 10:33:05 -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]] 00:01:48.871 10:33:05 -- common/autobuild_common.sh@68 -- $ gcc -dumpversion 00:01:48.871 10:33:05 -- common/autobuild_common.sh@68 -- $ compiler_version=13 00:01:48.871 10:33:05 -- common/autobuild_common.sh@69 -- $ compiler_version=13 00:01:48.871 10:33:05 -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:48.871 10:33:05 -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:48.871 10:33:05 -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:48.871 10:33:05 -- common/autobuild_common.sh@73 -- $ [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]] 00:01:48.871 10:33:05 -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:48.871 10:33:05 -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5 00:01:48.871 eeb0605f11 version: 23.11.0 00:01:48.871 238778122a doc: update release notes for 23.11 00:01:48.871 46aa6b3cfc doc: fix description of RSS features 00:01:48.871 dd88f51a57 devtools: forbid DPDK API in cnxk base driver 00:01:48.871 7e421ae345 devtools: support skipping forbid rule check 00:01:48.871 10:33:05 -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon' 00:01:48.871 10:33:05 -- common/autobuild_common.sh@86 -- $ dpdk_ldflags= 00:01:48.871 10:33:05 -- common/autobuild_common.sh@87 -- $ dpdk_ver=23.11.0 00:01:48.871 10:33:05 -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]] 00:01:48.871 10:33:05 -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]] 00:01:48.871 10:33:05 -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror' 00:01:48.871 10:33:05 -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]] 00:01:48.871 10:33:05 -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]] 00:01:48.871 10:33:05 -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow' 00:01:48.871 10:33:05 -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base") 00:01:48.871 10:33:05 -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n 00:01:48.871 10:33:05 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:01:48.871 10:33:05 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:01:48.871 10:33:05 -- common/autobuild_common.sh@139 -- $ [[ 0 -eq 1 ]] 00:01:48.871 10:33:05 -- common/autobuild_common.sh@167 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:48.871 10:33:05 -- common/autobuild_common.sh@168 -- $ uname -s 00:01:48.871 10:33:05 -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']' 00:01:48.871 10:33:05 -- common/autobuild_common.sh@169 -- $ lt 23.11.0 21.11.0 00:01:48.871 10:33:05 -- scripts/common.sh@372 -- $ cmp_versions 23.11.0 '<' 21.11.0 00:01:48.871 10:33:05 -- scripts/common.sh@332 -- $ local ver1 ver1_l 00:01:48.871 10:33:05 -- scripts/common.sh@333 -- $ local ver2 ver2_l 00:01:48.871 10:33:05 -- scripts/common.sh@335 -- $ IFS=.-: 00:01:48.871 10:33:05 -- scripts/common.sh@335 -- $ read -ra ver1 00:01:48.871 10:33:05 -- scripts/common.sh@336 -- $ IFS=.-: 00:01:48.871 10:33:05 -- scripts/common.sh@336 -- $ read -ra ver2 00:01:48.871 10:33:05 -- scripts/common.sh@337 -- $ local 'op=<' 00:01:48.871 10:33:05 -- scripts/common.sh@339 -- $ ver1_l=3 00:01:48.871 10:33:05 -- scripts/common.sh@340 -- $ ver2_l=3 00:01:48.871 10:33:05 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v 00:01:48.871 10:33:05 -- scripts/common.sh@343 -- $ case "$op" in 00:01:48.871 10:33:05 -- scripts/common.sh@344 -- $ : 1 00:01:48.871 10:33:05 -- scripts/common.sh@363 -- $ (( v = 0 )) 00:01:48.871 10:33:05 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:01:48.871 10:33:05 -- scripts/common.sh@364 -- $ decimal 23 00:01:48.871 10:33:05 -- scripts/common.sh@352 -- $ local d=23 00:01:48.871 10:33:05 -- scripts/common.sh@353 -- $ [[ 23 =~ ^[0-9]+$ ]] 00:01:48.871 10:33:05 -- scripts/common.sh@354 -- $ echo 23 00:01:48.871 10:33:05 -- scripts/common.sh@364 -- $ ver1[v]=23 00:01:48.871 10:33:05 -- scripts/common.sh@365 -- $ decimal 21 00:01:48.871 10:33:05 -- scripts/common.sh@352 -- $ local d=21 00:01:48.871 10:33:05 -- scripts/common.sh@353 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:01:48.871 10:33:05 -- scripts/common.sh@354 -- $ echo 21 00:01:48.871 10:33:05 -- scripts/common.sh@365 -- $ ver2[v]=21 00:01:48.871 10:33:05 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] )) 00:01:48.871 10:33:05 -- scripts/common.sh@366 -- $ return 1 00:01:48.871 10:33:05 -- common/autobuild_common.sh@173 -- $ patch -p1 00:01:48.871 patching file config/rte_config.h 00:01:48.871 Hunk #1 succeeded at 60 (offset 1 line). 00:01:48.871 10:33:05 -- common/autobuild_common.sh@177 -- $ dpdk_kmods=false 00:01:48.871 10:33:05 -- common/autobuild_common.sh@178 -- $ uname -s 00:01:48.871 10:33:05 -- common/autobuild_common.sh@178 -- $ '[' Linux = FreeBSD ']' 00:01:48.871 10:33:05 -- common/autobuild_common.sh@182 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base 00:01:48.871 10:33:05 -- common/autobuild_common.sh@182 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:01:53.061 The Meson build system 00:01:53.061 Version: 1.3.1 00:01:53.061 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:53.061 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp 00:01:53.061 Build type: native build 00:01:53.061 Program cat found: YES (/usr/bin/cat) 00:01:53.061 Project name: DPDK 00:01:53.061 Project version: 23.11.0 00:01:53.061 C compiler for the host machine: gcc (gcc 13.2.1 "gcc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)") 00:01:53.061 C linker for the host machine: gcc ld.bfd 2.39-16 00:01:53.061 Host machine cpu family: x86_64 00:01:53.061 Host machine cpu: x86_64 00:01:53.061 Message: ## Building in Developer Mode ## 00:01:53.061 Program pkg-config found: YES (/usr/bin/pkg-config) 00:01:53.061 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh) 00:01:53.061 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh) 00:01:53.061 Program python3 found: YES (/usr/bin/python3) 00:01:53.061 Program cat found: YES (/usr/bin/cat) 00:01:53.061 config/meson.build:113: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead. 
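The version check traced from scripts/common.sh above ("lt 23.11.0 21.11.0") splits both version strings on '.', '-' and ':' and compares them component-wise as integers; 23 > 21 at the first component, so the check returns 1, the DPDK tree is treated as >= 21.11, and the rte_config.h patch plus the meson configure step below are run. A simplified sketch of that comparison, not the exact SPDK helper (the real cmp_versions also handles '>', '=' and version lists of unequal length):

    #!/usr/bin/env bash
    # Component-wise "less than" over dotted version strings, in the spirit
    # of lt/cmp_versions from scripts/common.sh (simplified to '<' only).
    version_lt() {
        local -a v1 v2; local i
        IFS=.-: read -ra v1 <<< "$1"
        IFS=.-: read -ra v2 <<< "$2"
        for ((i = 0; i < ${#v1[@]} && i < ${#v2[@]}; i++)); do
            ((v1[i] < v2[i])) && return 0
            ((v1[i] > v2[i])) && return 1
        done
        return 1    # equal prefixes: not strictly less
    }
    version_lt 23.11.0 21.11.0 || echo "23.11.0 >= 21.11.0: patch for new DPDK"
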
00:01:53.061 Compiler for C supports arguments -march=native: YES 00:01:53.061 Checking for size of "void *" : 8 00:01:53.061 Checking for size of "void *" : 8 (cached) 00:01:53.061 Library m found: YES 00:01:53.061 Library numa found: YES 00:01:53.061 Has header "numaif.h" : YES 00:01:53.061 Library fdt found: NO 00:01:53.061 Library execinfo found: NO 00:01:53.061 Has header "execinfo.h" : YES 00:01:53.061 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:01:53.061 Run-time dependency libarchive found: NO (tried pkgconfig) 00:01:53.061 Run-time dependency libbsd found: NO (tried pkgconfig) 00:01:53.061 Run-time dependency jansson found: NO (tried pkgconfig) 00:01:53.061 Run-time dependency openssl found: YES 3.0.9 00:01:53.061 Run-time dependency libpcap found: YES 1.10.4 00:01:53.061 Has header "pcap.h" with dependency libpcap: YES 00:01:53.061 Compiler for C supports arguments -Wcast-qual: YES 00:01:53.061 Compiler for C supports arguments -Wdeprecated: YES 00:01:53.061 Compiler for C supports arguments -Wformat: YES 00:01:53.061 Compiler for C supports arguments -Wformat-nonliteral: NO 00:01:53.061 Compiler for C supports arguments -Wformat-security: NO 00:01:53.061 Compiler for C supports arguments -Wmissing-declarations: YES 00:01:53.061 Compiler for C supports arguments -Wmissing-prototypes: YES 00:01:53.061 Compiler for C supports arguments -Wnested-externs: YES 00:01:53.061 Compiler for C supports arguments -Wold-style-definition: YES 00:01:53.061 Compiler for C supports arguments -Wpointer-arith: YES 00:01:53.061 Compiler for C supports arguments -Wsign-compare: YES 00:01:53.061 Compiler for C supports arguments -Wstrict-prototypes: YES 00:01:53.061 Compiler for C supports arguments -Wundef: YES 00:01:53.061 Compiler for C supports arguments -Wwrite-strings: YES 00:01:53.061 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:01:53.061 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:01:53.061 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:01:53.061 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:01:53.061 Program objdump found: YES (/usr/bin/objdump) 00:01:53.061 Compiler for C supports arguments -mavx512f: YES 00:01:53.061 Checking if "AVX512 checking" compiles: YES 00:01:53.061 Fetching value of define "__SSE4_2__" : 1 00:01:53.061 Fetching value of define "__AES__" : 1 00:01:53.061 Fetching value of define "__AVX__" : 1 00:01:53.061 Fetching value of define "__AVX2__" : 1 00:01:53.061 Fetching value of define "__AVX512BW__" : 1 00:01:53.061 Fetching value of define "__AVX512CD__" : 1 00:01:53.061 Fetching value of define "__AVX512DQ__" : 1 00:01:53.061 Fetching value of define "__AVX512F__" : 1 00:01:53.061 Fetching value of define "__AVX512VL__" : 1 00:01:53.061 Fetching value of define "__PCLMUL__" : 1 00:01:53.061 Fetching value of define "__RDRND__" : 1 00:01:53.061 Fetching value of define "__RDSEED__" : 1 00:01:53.061 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:01:53.061 Fetching value of define "__znver1__" : (undefined) 00:01:53.061 Fetching value of define "__znver2__" : (undefined) 00:01:53.061 Fetching value of define "__znver3__" : (undefined) 00:01:53.061 Fetching value of define "__znver4__" : (undefined) 00:01:53.061 Compiler for C supports arguments -Wno-format-truncation: YES 00:01:53.061 Message: lib/log: Defining dependency "log" 00:01:53.061 Message: lib/kvargs: Defining dependency "kvargs" 00:01:53.061 Message: lib/telemetry: Defining dependency 
"telemetry" 00:01:53.061 Checking for function "getentropy" : NO 00:01:53.061 Message: lib/eal: Defining dependency "eal" 00:01:53.061 Message: lib/ring: Defining dependency "ring" 00:01:53.061 Message: lib/rcu: Defining dependency "rcu" 00:01:53.061 Message: lib/mempool: Defining dependency "mempool" 00:01:53.061 Message: lib/mbuf: Defining dependency "mbuf" 00:01:53.061 Fetching value of define "__PCLMUL__" : 1 (cached) 00:01:53.061 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:53.061 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:53.061 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:53.061 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:53.061 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:01:53.061 Compiler for C supports arguments -mpclmul: YES 00:01:53.061 Compiler for C supports arguments -maes: YES 00:01:53.061 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:53.061 Compiler for C supports arguments -mavx512bw: YES 00:01:53.061 Compiler for C supports arguments -mavx512dq: YES 00:01:53.061 Compiler for C supports arguments -mavx512vl: YES 00:01:53.061 Compiler for C supports arguments -mvpclmulqdq: YES 00:01:53.061 Compiler for C supports arguments -mavx2: YES 00:01:53.061 Compiler for C supports arguments -mavx: YES 00:01:53.061 Message: lib/net: Defining dependency "net" 00:01:53.061 Message: lib/meter: Defining dependency "meter" 00:01:53.061 Message: lib/ethdev: Defining dependency "ethdev" 00:01:53.061 Message: lib/pci: Defining dependency "pci" 00:01:53.061 Message: lib/cmdline: Defining dependency "cmdline" 00:01:53.061 Message: lib/metrics: Defining dependency "metrics" 00:01:53.061 Message: lib/hash: Defining dependency "hash" 00:01:53.061 Message: lib/timer: Defining dependency "timer" 00:01:53.061 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:53.061 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:53.061 Fetching value of define "__AVX512CD__" : 1 (cached) 00:01:53.061 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:53.061 Message: lib/acl: Defining dependency "acl" 00:01:53.061 Message: lib/bbdev: Defining dependency "bbdev" 00:01:53.061 Message: lib/bitratestats: Defining dependency "bitratestats" 00:01:53.061 Run-time dependency libelf found: YES 0.190 00:01:53.061 Message: lib/bpf: Defining dependency "bpf" 00:01:53.061 Message: lib/cfgfile: Defining dependency "cfgfile" 00:01:53.061 Message: lib/compressdev: Defining dependency "compressdev" 00:01:53.061 Message: lib/cryptodev: Defining dependency "cryptodev" 00:01:53.061 Message: lib/distributor: Defining dependency "distributor" 00:01:53.061 Message: lib/dmadev: Defining dependency "dmadev" 00:01:53.061 Message: lib/efd: Defining dependency "efd" 00:01:53.061 Message: lib/eventdev: Defining dependency "eventdev" 00:01:53.061 Message: lib/dispatcher: Defining dependency "dispatcher" 00:01:53.061 Message: lib/gpudev: Defining dependency "gpudev" 00:01:53.061 Message: lib/gro: Defining dependency "gro" 00:01:53.061 Message: lib/gso: Defining dependency "gso" 00:01:53.061 Message: lib/ip_frag: Defining dependency "ip_frag" 00:01:53.061 Message: lib/jobstats: Defining dependency "jobstats" 00:01:53.061 Message: lib/latencystats: Defining dependency "latencystats" 00:01:53.061 Message: lib/lpm: Defining dependency "lpm" 00:01:53.061 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:53.061 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:53.061 Fetching value of define "__AVX512IFMA__" : 
(undefined) 00:01:53.061 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES 00:01:53.061 Message: lib/member: Defining dependency "member" 00:01:53.061 Message: lib/pcapng: Defining dependency "pcapng" 00:01:53.061 Compiler for C supports arguments -Wno-cast-qual: YES 00:01:53.061 Message: lib/power: Defining dependency "power" 00:01:53.061 Message: lib/rawdev: Defining dependency "rawdev" 00:01:53.061 Message: lib/regexdev: Defining dependency "regexdev" 00:01:53.061 Message: lib/mldev: Defining dependency "mldev" 00:01:53.061 Message: lib/rib: Defining dependency "rib" 00:01:53.061 Message: lib/reorder: Defining dependency "reorder" 00:01:53.061 Message: lib/sched: Defining dependency "sched" 00:01:53.061 Message: lib/security: Defining dependency "security" 00:01:53.061 Message: lib/stack: Defining dependency "stack" 00:01:53.061 Has header "linux/userfaultfd.h" : YES 00:01:53.061 Has header "linux/vduse.h" : YES 00:01:53.061 Message: lib/vhost: Defining dependency "vhost" 00:01:53.061 Message: lib/ipsec: Defining dependency "ipsec" 00:01:53.061 Message: lib/pdcp: Defining dependency "pdcp" 00:01:53.061 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:53.061 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:53.061 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:53.061 Message: lib/fib: Defining dependency "fib" 00:01:53.061 Message: lib/port: Defining dependency "port" 00:01:53.061 Message: lib/pdump: Defining dependency "pdump" 00:01:53.061 Message: lib/table: Defining dependency "table" 00:01:53.061 Message: lib/pipeline: Defining dependency "pipeline" 00:01:53.061 Message: lib/graph: Defining dependency "graph" 00:01:53.061 Message: lib/node: Defining dependency "node" 00:01:53.061 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:01:54.440 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:01:54.440 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:01:54.440 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:01:54.440 Compiler for C supports arguments -Wno-sign-compare: YES 00:01:54.440 Compiler for C supports arguments -Wno-unused-value: YES 00:01:54.440 Compiler for C supports arguments -Wno-format: YES 00:01:54.440 Compiler for C supports arguments -Wno-format-security: YES 00:01:54.440 Compiler for C supports arguments -Wno-format-nonliteral: YES 00:01:54.441 Compiler for C supports arguments -Wno-strict-aliasing: YES 00:01:54.441 Compiler for C supports arguments -Wno-unused-but-set-variable: YES 00:01:54.441 Compiler for C supports arguments -Wno-unused-parameter: YES 00:01:54.441 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:54.441 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:54.441 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:54.441 Compiler for C supports arguments -mavx512bw: YES (cached) 00:01:54.441 Compiler for C supports arguments -march=skylake-avx512: YES 00:01:54.441 Message: drivers/net/i40e: Defining dependency "net_i40e" 00:01:54.441 Has header "sys/epoll.h" : YES 00:01:54.441 Program doxygen found: YES (/usr/bin/doxygen) 00:01:54.441 Configuring doxy-api-html.conf using configuration 00:01:54.441 Configuring doxy-api-man.conf using configuration 00:01:54.441 Program mandb found: YES (/usr/bin/mandb) 00:01:54.441 Program sphinx-build found: NO 00:01:54.441 Configuring rte_build_config.h using configuration 00:01:54.441 Message: 00:01:54.441 ================= 00:01:54.441 Applications Enabled 00:01:54.441 
================= 00:01:54.441 00:01:54.441 apps: 00:01:54.441 dumpcap, graph, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf, 00:01:54.441 test-crypto-perf, test-dma-perf, test-eventdev, test-fib, test-flow-perf, test-gpudev, test-mldev, test-pipeline, 00:01:54.441 test-pmd, test-regex, test-sad, test-security-perf, 00:01:54.441 00:01:54.441 Message: 00:01:54.441 ================= 00:01:54.441 Libraries Enabled 00:01:54.441 ================= 00:01:54.441 00:01:54.441 libs: 00:01:54.441 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:01:54.441 net, meter, ethdev, pci, cmdline, metrics, hash, timer, 00:01:54.441 acl, bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor, 00:01:54.441 dmadev, efd, eventdev, dispatcher, gpudev, gro, gso, ip_frag, 00:01:54.441 jobstats, latencystats, lpm, member, pcapng, power, rawdev, regexdev, 00:01:54.441 mldev, rib, reorder, sched, security, stack, vhost, ipsec, 00:01:54.441 pdcp, fib, port, pdump, table, pipeline, graph, node, 00:01:54.441 00:01:54.441 00:01:54.441 Message: 00:01:54.441 =============== 00:01:54.441 Drivers Enabled 00:01:54.441 =============== 00:01:54.441 00:01:54.441 common: 00:01:54.441 00:01:54.441 bus: 00:01:54.441 pci, vdev, 00:01:54.441 mempool: 00:01:54.441 ring, 00:01:54.441 dma: 00:01:54.441 00:01:54.441 net: 00:01:54.441 i40e, 00:01:54.441 raw: 00:01:54.441 00:01:54.441 crypto: 00:01:54.441 00:01:54.441 compress: 00:01:54.441 00:01:54.441 regex: 00:01:54.441 00:01:54.441 ml: 00:01:54.441 00:01:54.441 vdpa: 00:01:54.441 00:01:54.441 event: 00:01:54.441 00:01:54.441 baseband: 00:01:54.441 00:01:54.441 gpu: 00:01:54.441 00:01:54.441 00:01:54.441 Message: 00:01:54.441 ================= 00:01:54.441 Content Skipped 00:01:54.441 ================= 00:01:54.441 00:01:54.441 apps: 00:01:54.441 00:01:54.441 libs: 00:01:54.441 00:01:54.441 drivers: 00:01:54.441 common/cpt: not in enabled drivers build config 00:01:54.441 common/dpaax: not in enabled drivers build config 00:01:54.441 common/iavf: not in enabled drivers build config 00:01:54.441 common/idpf: not in enabled drivers build config 00:01:54.441 common/mvep: not in enabled drivers build config 00:01:54.441 common/octeontx: not in enabled drivers build config 00:01:54.441 bus/auxiliary: not in enabled drivers build config 00:01:54.441 bus/cdx: not in enabled drivers build config 00:01:54.441 bus/dpaa: not in enabled drivers build config 00:01:54.441 bus/fslmc: not in enabled drivers build config 00:01:54.441 bus/ifpga: not in enabled drivers build config 00:01:54.441 bus/platform: not in enabled drivers build config 00:01:54.441 bus/vmbus: not in enabled drivers build config 00:01:54.441 common/cnxk: not in enabled drivers build config 00:01:54.441 common/mlx5: not in enabled drivers build config 00:01:54.441 common/nfp: not in enabled drivers build config 00:01:54.441 common/qat: not in enabled drivers build config 00:01:54.441 common/sfc_efx: not in enabled drivers build config 00:01:54.441 mempool/bucket: not in enabled drivers build config 00:01:54.441 mempool/cnxk: not in enabled drivers build config 00:01:54.441 mempool/dpaa: not in enabled drivers build config 00:01:54.441 mempool/dpaa2: not in enabled drivers build config 00:01:54.441 mempool/octeontx: not in enabled drivers build config 00:01:54.441 mempool/stack: not in enabled drivers build config 00:01:54.441 dma/cnxk: not in enabled drivers build config 00:01:54.441 dma/dpaa: not in enabled drivers build config 00:01:54.441 dma/dpaa2: not in enabled drivers build 
config 00:01:54.441 dma/hisilicon: not in enabled drivers build config 00:01:54.441 dma/idxd: not in enabled drivers build config 00:01:54.441 dma/ioat: not in enabled drivers build config 00:01:54.441 dma/skeleton: not in enabled drivers build config 00:01:54.441 net/af_packet: not in enabled drivers build config 00:01:54.441 net/af_xdp: not in enabled drivers build config 00:01:54.441 net/ark: not in enabled drivers build config 00:01:54.441 net/atlantic: not in enabled drivers build config 00:01:54.441 net/avp: not in enabled drivers build config 00:01:54.441 net/axgbe: not in enabled drivers build config 00:01:54.441 net/bnx2x: not in enabled drivers build config 00:01:54.441 net/bnxt: not in enabled drivers build config 00:01:54.441 net/bonding: not in enabled drivers build config 00:01:54.441 net/cnxk: not in enabled drivers build config 00:01:54.441 net/cpfl: not in enabled drivers build config 00:01:54.441 net/cxgbe: not in enabled drivers build config 00:01:54.441 net/dpaa: not in enabled drivers build config 00:01:54.441 net/dpaa2: not in enabled drivers build config 00:01:54.441 net/e1000: not in enabled drivers build config 00:01:54.441 net/ena: not in enabled drivers build config 00:01:54.441 net/enetc: not in enabled drivers build config 00:01:54.441 net/enetfec: not in enabled drivers build config 00:01:54.441 net/enic: not in enabled drivers build config 00:01:54.441 net/failsafe: not in enabled drivers build config 00:01:54.441 net/fm10k: not in enabled drivers build config 00:01:54.441 net/gve: not in enabled drivers build config 00:01:54.441 net/hinic: not in enabled drivers build config 00:01:54.441 net/hns3: not in enabled drivers build config 00:01:54.441 net/iavf: not in enabled drivers build config 00:01:54.441 net/ice: not in enabled drivers build config 00:01:54.441 net/idpf: not in enabled drivers build config 00:01:54.441 net/igc: not in enabled drivers build config 00:01:54.441 net/ionic: not in enabled drivers build config 00:01:54.441 net/ipn3ke: not in enabled drivers build config 00:01:54.441 net/ixgbe: not in enabled drivers build config 00:01:54.441 net/mana: not in enabled drivers build config 00:01:54.441 net/memif: not in enabled drivers build config 00:01:54.441 net/mlx4: not in enabled drivers build config 00:01:54.441 net/mlx5: not in enabled drivers build config 00:01:54.441 net/mvneta: not in enabled drivers build config 00:01:54.441 net/mvpp2: not in enabled drivers build config 00:01:54.441 net/netvsc: not in enabled drivers build config 00:01:54.441 net/nfb: not in enabled drivers build config 00:01:54.441 net/nfp: not in enabled drivers build config 00:01:54.441 net/ngbe: not in enabled drivers build config 00:01:54.441 net/null: not in enabled drivers build config 00:01:54.441 net/octeontx: not in enabled drivers build config 00:01:54.441 net/octeon_ep: not in enabled drivers build config 00:01:54.441 net/pcap: not in enabled drivers build config 00:01:54.441 net/pfe: not in enabled drivers build config 00:01:54.441 net/qede: not in enabled drivers build config 00:01:54.441 net/ring: not in enabled drivers build config 00:01:54.441 net/sfc: not in enabled drivers build config 00:01:54.441 net/softnic: not in enabled drivers build config 00:01:54.441 net/tap: not in enabled drivers build config 00:01:54.441 net/thunderx: not in enabled drivers build config 00:01:54.441 net/txgbe: not in enabled drivers build config 00:01:54.441 net/vdev_netvsc: not in enabled drivers build config 00:01:54.441 net/vhost: not in enabled drivers build config 
00:01:54.441 net/virtio: not in enabled drivers build config 00:01:54.441 net/vmxnet3: not in enabled drivers build config 00:01:54.441 raw/cnxk_bphy: not in enabled drivers build config 00:01:54.441 raw/cnxk_gpio: not in enabled drivers build config 00:01:54.441 raw/dpaa2_cmdif: not in enabled drivers build config 00:01:54.441 raw/ifpga: not in enabled drivers build config 00:01:54.441 raw/ntb: not in enabled drivers build config 00:01:54.441 raw/skeleton: not in enabled drivers build config 00:01:54.441 crypto/armv8: not in enabled drivers build config 00:01:54.441 crypto/bcmfs: not in enabled drivers build config 00:01:54.441 crypto/caam_jr: not in enabled drivers build config 00:01:54.441 crypto/ccp: not in enabled drivers build config 00:01:54.441 crypto/cnxk: not in enabled drivers build config 00:01:54.441 crypto/dpaa_sec: not in enabled drivers build config 00:01:54.441 crypto/dpaa2_sec: not in enabled drivers build config 00:01:54.441 crypto/ipsec_mb: not in enabled drivers build config 00:01:54.441 crypto/mlx5: not in enabled drivers build config 00:01:54.441 crypto/mvsam: not in enabled drivers build config 00:01:54.441 crypto/nitrox: not in enabled drivers build config 00:01:54.441 crypto/null: not in enabled drivers build config 00:01:54.441 crypto/octeontx: not in enabled drivers build config 00:01:54.441 crypto/openssl: not in enabled drivers build config 00:01:54.441 crypto/scheduler: not in enabled drivers build config 00:01:54.441 crypto/uadk: not in enabled drivers build config 00:01:54.441 crypto/virtio: not in enabled drivers build config 00:01:54.441 compress/isal: not in enabled drivers build config 00:01:54.441 compress/mlx5: not in enabled drivers build config 00:01:54.441 compress/octeontx: not in enabled drivers build config 00:01:54.441 compress/zlib: not in enabled drivers build config 00:01:54.441 regex/mlx5: not in enabled drivers build config 00:01:54.441 regex/cn9k: not in enabled drivers build config 00:01:54.441 ml/cnxk: not in enabled drivers build config 00:01:54.441 vdpa/ifc: not in enabled drivers build config 00:01:54.441 vdpa/mlx5: not in enabled drivers build config 00:01:54.441 vdpa/nfp: not in enabled drivers build config 00:01:54.441 vdpa/sfc: not in enabled drivers build config 00:01:54.441 event/cnxk: not in enabled drivers build config 00:01:54.441 event/dlb2: not in enabled drivers build config 00:01:54.441 event/dpaa: not in enabled drivers build config 00:01:54.441 event/dpaa2: not in enabled drivers build config 00:01:54.441 event/dsw: not in enabled drivers build config 00:01:54.441 event/opdl: not in enabled drivers build config 00:01:54.441 event/skeleton: not in enabled drivers build config 00:01:54.441 event/sw: not in enabled drivers build config 00:01:54.442 event/octeontx: not in enabled drivers build config 00:01:54.442 baseband/acc: not in enabled drivers build config 00:01:54.442 baseband/fpga_5gnr_fec: not in enabled drivers build config 00:01:54.442 baseband/fpga_lte_fec: not in enabled drivers build config 00:01:54.442 baseband/la12xx: not in enabled drivers build config 00:01:54.442 baseband/null: not in enabled drivers build config 00:01:54.442 baseband/turbo_sw: not in enabled drivers build config 00:01:54.442 gpu/cuda: not in enabled drivers build config 00:01:54.442 00:01:54.442 00:01:54.442 Build targets in project: 217 00:01:54.442 00:01:54.442 DPDK 23.11.0 00:01:54.442 00:01:54.442 User defined options 00:01:54.442 libdir : lib 00:01:54.442 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 
00:01:54.442 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow 00:01:54.442 c_link_args : 00:01:54.442 enable_docs : false 00:01:54.442 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:01:54.442 enable_kmods : false 00:01:54.442 machine : native 00:01:54.442 tests : false 00:01:54.442 00:01:54.442 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:54.442 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated. 00:01:54.442 10:33:10 -- common/autobuild_common.sh@186 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 00:01:54.442 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:01:54.708 [1/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:01:54.708 [2/707] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:01:54.708 [3/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:01:54.708 [4/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:01:54.708 [5/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:01:54.708 [6/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:01:54.708 [7/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:01:54.708 [8/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:01:54.708 [9/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:01:54.708 [10/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:01:54.708 [11/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:01:54.708 [12/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:01:54.708 [13/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:01:54.708 [14/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:01:54.708 [15/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:01:54.708 [16/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:01:54.708 [17/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:01:54.970 [18/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:01:54.970 [19/707] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:01:54.970 [20/707] Linking static target lib/librte_kvargs.a 00:01:54.970 [21/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:01:54.970 [22/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:01:54.970 [23/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:01:54.970 [24/707] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:01:54.970 [25/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:01:54.970 [26/707] Compiling C object lib/librte_log.a.p/log_log.c.o 00:01:54.970 [27/707] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:01:54.970 [28/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:01:54.970 [29/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:01:54.970 [30/707] Linking static target lib/librte_pci.a 00:01:54.970 [31/707] Linking static target lib/librte_log.a 00:01:54.970 [32/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:01:54.970 [33/707] 
Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:01:54.970 [34/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:01:54.970 [35/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:01:54.970 [36/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:01:55.230 [37/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:01:55.230 [38/707] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.230 [39/707] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.230 [40/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:01:55.230 [41/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:01:55.230 [42/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:01:55.230 [43/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:01:55.230 [44/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:01:55.230 [45/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:01:55.230 [46/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:01:55.230 [47/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:01:55.230 [48/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:01:55.230 [49/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:01:55.230 [50/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:01:55.230 [51/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:01:55.230 [52/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:01:55.498 [53/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:01:55.498 [54/707] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:01:55.498 [55/707] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:01:55.498 [56/707] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:01:55.498 [57/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:01:55.498 [58/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:01:55.498 [59/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:01:55.498 [60/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:01:55.498 [61/707] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:01:55.498 [62/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:01:55.498 [63/707] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:01:55.498 [64/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:01:55.498 [65/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:01:55.498 [66/707] Linking static target lib/librte_meter.a 00:01:55.498 [67/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:01:55.498 [68/707] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:01:55.498 [69/707] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:01:55.498 [70/707] Linking static target lib/librte_ring.a 00:01:55.498 [71/707] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o 00:01:55.498 [72/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:01:55.498 [73/707] Compiling C object 
lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:01:55.498 [74/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:01:55.498 [75/707] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:01:55.498 [76/707] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:01:55.498 [77/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:01:55.498 [78/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:01:55.498 [79/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:01:55.498 [80/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:01:55.498 [81/707] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:01:55.498 [82/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:01:55.498 [83/707] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o 00:01:55.498 [84/707] Linking static target lib/librte_cmdline.a 00:01:55.498 [85/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:01:55.498 [86/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:01:55.498 [87/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:01:55.498 [88/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:01:55.498 [89/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:01:55.498 [90/707] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o 00:01:55.498 [91/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:01:55.498 [92/707] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:01:55.498 [93/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:01:55.498 [94/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:01:55.498 [95/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:01:55.498 [96/707] Linking static target lib/librte_metrics.a 00:01:55.498 [97/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:01:55.499 [98/707] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:01:55.499 [99/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:01:55.499 [100/707] Linking static target lib/net/libnet_crc_avx512_lib.a 00:01:55.499 [101/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:01:55.499 [102/707] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:01:55.499 [103/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:01:55.499 [104/707] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:01:55.499 [105/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:01:55.499 [106/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:01:55.499 [107/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:01:55.499 [108/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:01:55.499 [109/707] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:01:55.760 [110/707] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:01:55.760 [111/707] Linking static target lib/librte_net.a 00:01:55.760 [112/707] Linking static target lib/librte_bitratestats.a 00:01:55.760 [113/707] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o 00:01:55.760 [114/707] Compiling C object 
lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:01:55.760 [115/707] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.760 [116/707] Linking static target lib/librte_cfgfile.a 00:01:55.760 [117/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:01:55.760 [118/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:01:55.760 [119/707] Linking target lib/librte_log.so.24.0 00:01:55.760 [120/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:01:55.760 [121/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:01:55.760 [122/707] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:01:55.760 [123/707] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:01:55.760 [124/707] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:01:55.760 [125/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:01:55.760 [126/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:01:55.760 [127/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:01:55.760 [128/707] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:01:55.760 [129/707] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.760 [130/707] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 00:01:55.760 [131/707] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:01:55.760 [132/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:01:55.760 [133/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:01:55.760 [134/707] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.760 [135/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:01:56.024 [136/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:01:56.024 [137/707] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:01:56.024 [138/707] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols 00:01:56.024 [139/707] Linking static target lib/librte_timer.a 00:01:56.024 [140/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:01:56.024 [141/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:01:56.024 [142/707] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:01:56.024 [143/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:01:56.024 [144/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:01:56.024 [145/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:01:56.024 [146/707] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.024 [147/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:01:56.024 [148/707] Linking target lib/librte_kvargs.so.24.0 00:01:56.024 [149/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:01:56.024 [150/707] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.024 [151/707] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:01:56.024 [152/707] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:01:56.024 [153/707] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:01:56.024 
[154/707] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:01:56.024 [155/707] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o 00:01:56.024 [156/707] Linking static target lib/librte_mempool.a 00:01:56.024 [157/707] Linking static target lib/librte_bbdev.a 00:01:56.024 [158/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:01:56.024 [159/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:56.024 [160/707] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:56.024 [161/707] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:56.024 [162/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:01:56.024 [163/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:01:56.024 [164/707] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.024 [165/707] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:01:56.024 [166/707] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:01:56.024 [167/707] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:01:56.024 [168/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:01:56.024 [169/707] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:56.024 [170/707] Linking static target lib/librte_jobstats.a 00:01:56.024 [171/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:56.290 [172/707] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:56.290 [173/707] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:01:56.290 [174/707] Linking static target lib/librte_compressdev.a 00:01:56.290 [175/707] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:01:56.290 [176/707] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.290 [177/707] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:01:56.290 [178/707] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:01:56.290 [179/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:01:56.290 [180/707] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:56.290 [181/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:01:56.290 [182/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:56.290 [183/707] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:01:56.290 [184/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:01:56.290 [185/707] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:01:56.290 [186/707] Compiling C object lib/librte_dispatcher.a.p/dispatcher_rte_dispatcher.c.o 00:01:56.290 [187/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils.c.o 00:01:56.290 [188/707] Compiling C object lib/librte_gro.a.p/gro_gro_tcp6.c.o 00:01:56.290 [189/707] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev_pmd.c.o 00:01:56.290 [190/707] Linking static target lib/librte_dispatcher.a 00:01:56.290 [191/707] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:01:56.290 [192/707] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:01:56.290 [193/707] Linking static target lib/member/libsketch_avx512_tmp.a 00:01:56.290 [194/707] Compiling C object 
lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:01:56.290 [195/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:56.290 [196/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar_bfloat16.c.o 00:01:56.290 [197/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:01:56.290 [198/707] Linking static target lib/librte_latencystats.a 00:01:56.290 [199/707] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:01:56.290 [200/707] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:56.290 [201/707] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:01:56.290 [202/707] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:01:56.290 [203/707] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:01:56.555 [204/707] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:01:56.555 [205/707] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:01:56.555 [206/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:01:56.555 [207/707] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.555 [208/707] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:56.555 [209/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:01:56.555 [210/707] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:56.555 [211/707] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:01:56.555 [212/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:01:56.555 [213/707] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:56.555 [214/707] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:01:56.555 [215/707] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:56.555 [216/707] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:01:56.555 [217/707] Linking static target lib/librte_telemetry.a 00:01:56.555 [218/707] Linking static target lib/librte_stack.a 00:01:56.555 [219/707] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:01:56.555 [220/707] Linking static target lib/librte_rcu.a 00:01:56.555 [221/707] Linking static target lib/librte_gro.a 00:01:56.555 [222/707] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:56.555 [223/707] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:01:56.555 [224/707] Linking static target lib/librte_eal.a 00:01:56.555 [225/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:01:56.555 [226/707] Linking static target lib/librte_dmadev.a 00:01:56.555 [227/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:01:56.555 [228/707] Linking static target lib/librte_gpudev.a 00:01:56.555 [229/707] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:01:56.555 [230/707] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:01:56.555 [231/707] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:01:56.555 [232/707] Linking static target lib/librte_distributor.a 00:01:56.555 [233/707] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:56.555 [234/707] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:01:56.555 [235/707] Linking static target lib/librte_gso.a 00:01:56.555 [236/707] Linking static target lib/librte_regexdev.a 00:01:56.555 
[237/707] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev.c.o 00:01:56.555 [238/707] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:56.555 [239/707] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar.c.o 00:01:56.555 [240/707] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:01:56.555 [241/707] Linking static target lib/librte_power.a 00:01:56.555 [242/707] Linking static target lib/librte_mldev.a 00:01:56.555 [243/707] Linking static target lib/librte_rawdev.a 00:01:56.555 [244/707] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.555 [245/707] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:01:56.555 [246/707] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:01:56.555 [247/707] Linking static target lib/librte_ip_frag.a 00:01:56.555 [248/707] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:56.555 [249/707] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:01:56.819 [250/707] Linking static target lib/librte_mbuf.a 00:01:56.820 [251/707] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:01:56.820 [252/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:01:56.820 [253/707] Linking static target lib/librte_pcapng.a 00:01:56.820 [254/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_reorder.c.o 00:01:56.820 [255/707] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:01:56.820 [256/707] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:56.820 [257/707] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.820 [258/707] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.820 [259/707] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:56.820 [260/707] Linking static target lib/librte_bpf.a 00:01:56.820 [261/707] Linking static target lib/librte_reorder.a 00:01:56.820 [262/707] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.820 [263/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_dma_adapter.c.o 00:01:56.820 [264/707] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:01:56.820 [265/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:01:56.820 [266/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_ctrl_pdu.c.o 00:01:56.820 [267/707] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:01:56.820 [268/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_crypto.c.o 00:01:56.820 [269/707] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.820 [270/707] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.820 [271/707] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:56.820 [272/707] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:56.820 [273/707] Linking static target lib/librte_security.a 00:01:56.820 [274/707] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.820 [275/707] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.820 [276/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_cnt.c.o 00:01:57.079 [277/707] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:57.079 [278/707] 
Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:01:57.079 [279/707] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.079 [280/707] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:01:57.079 [281/707] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.079 [282/707] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:01:57.079 [283/707] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:01:57.079 [284/707] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:01:57.079 [285/707] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:01:57.079 [286/707] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.079 [287/707] Generating lib/dispatcher.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.079 [288/707] Compiling C object lib/librte_node.a.p/node_null.c.o 00:01:57.079 [289/707] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:01:57.079 [290/707] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.079 [291/707] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:01:57.079 [292/707] Linking static target lib/librte_lpm.a 00:01:57.079 [293/707] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.079 [294/707] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.079 [295/707] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:57.079 [296/707] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:01:57.079 [297/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:57.079 [298/707] Compiling C object lib/librte_pdcp.a.p/pdcp_rte_pdcp.c.o 00:01:57.079 [299/707] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:57.079 [300/707] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:01:57.079 [301/707] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.079 [302/707] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:01:57.079 [303/707] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:01:57.079 [304/707] Linking static target lib/librte_rib.a 00:01:57.342 [305/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:01:57.342 [306/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:01:57.342 [307/707] Linking target lib/librte_telemetry.so.24.0 00:01:57.342 [308/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:01:57.342 [309/707] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:01:57.342 [310/707] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:01:57.342 [311/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:01:57.342 [312/707] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.342 [313/707] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.342 [314/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:01:57.342 [315/707] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:01:57.342 [316/707] Compiling C object 
lib/librte_port.a.p/port_rte_port_frag.c.o 00:01:57.342 [317/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:01:57.342 [318/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:01:57.342 [319/707] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:01:57.342 [320/707] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.342 [321/707] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:01:57.342 [322/707] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:01:57.342 [323/707] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:01:57.342 [324/707] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:01:57.342 [325/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:01:57.342 [326/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:01:57.342 [327/707] Compiling C object lib/librte_graph.a.p/graph_rte_graph_worker.c.o 00:01:57.342 [328/707] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:01:57.342 [329/707] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:01:57.342 [330/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:01:57.342 [331/707] Linking static target lib/librte_efd.a 00:01:57.342 [332/707] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:01:57.342 [333/707] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:01:57.604 [334/707] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:01:57.604 [335/707] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:01:57.604 [336/707] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:01:57.604 [337/707] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:01:57.604 [338/707] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.604 [339/707] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.604 [340/707] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:01:57.604 [341/707] Compiling C object lib/librte_graph.a.p/graph_graph_pcap.c.o 00:01:57.604 [342/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:01:57.604 [343/707] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:01:57.604 [344/707] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:01:57.604 [345/707] Compiling C object lib/librte_node.a.p/node_log.c.o 00:01:57.604 [346/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:57.604 [347/707] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.604 [348/707] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:01:57.604 [349/707] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:01:57.604 [350/707] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:01:57.604 [351/707] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:01:57.604 [352/707] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:01:57.604 [353/707] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.604 [354/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:57.868 [355/707] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 
00:01:57.868 [356/707] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.868 [357/707] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:01:57.868 [358/707] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:01:57.868 [359/707] Linking static target lib/librte_fib.a 00:01:57.868 [360/707] Compiling C object lib/librte_node.a.p/node_kernel_tx.c.o 00:01:57.868 [361/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:01:57.868 [362/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:01:57.868 [363/707] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.868 [364/707] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:01:57.868 [365/707] Compiling C object lib/librte_node.a.p/node_ip4_reassembly.c.o 00:01:57.868 [366/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:57.868 [367/707] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:01:57.868 [368/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:57.868 [369/707] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.868 [370/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:01:57.868 [371/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:01:57.868 [372/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:01:57.868 [373/707] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:57.868 [374/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:01:57.868 [375/707] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:57.868 [376/707] Compiling C object lib/librte_node.a.p/node_ip4_local.c.o 00:01:57.868 [377/707] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.868 [378/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:58.134 [379/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:01:58.134 [380/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:58.134 [381/707] Compiling C object lib/librte_graph.a.p/graph_rte_graph_model_mcore_dispatch.c.o 00:01:58.134 [382/707] Linking static target lib/librte_graph.a 00:01:58.134 [383/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:58.134 [384/707] Compiling C object app/dpdk-graph.p/graph_cli.c.o 00:01:58.134 [385/707] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:01:58.134 [386/707] Linking static target lib/librte_pdump.a 00:01:58.134 [387/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:01:58.134 [388/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:01:58.134 [389/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:01:58.134 [390/707] Compiling C object app/dpdk-graph.p/graph_ip4_route.c.o 00:01:58.134 [391/707] Compiling C object app/dpdk-graph.p/graph_ip6_route.c.o 00:01:58.134 [392/707] Compiling C object app/dpdk-graph.p/graph_mempool.c.o 00:01:58.134 [393/707] Compiling C object app/dpdk-graph.p/graph_conn.c.o 00:01:58.134 [394/707] Compiling C object app/dpdk-graph.p/graph_main.c.o 00:01:58.134 [395/707] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:58.134 [396/707] Compiling C object 
app/dpdk-graph.p/graph_utils.c.o 00:01:58.134 [397/707] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:58.134 [398/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:01:58.134 [399/707] Compiling C object lib/librte_node.a.p/node_udp4_input.c.o 00:01:58.134 [400/707] Compiling C object app/dpdk-graph.p/graph_ethdev_rx.c.o 00:01:58.134 [401/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:01:58.134 [402/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_test.c.o 00:01:58.134 [403/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:01:58.134 [404/707] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:58.134 [405/707] Compiling C object app/dpdk-graph.p/graph_l3fwd.c.o 00:01:58.134 [406/707] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:01:58.396 [407/707] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:58.396 [408/707] Linking static target drivers/librte_bus_vdev.a 00:01:58.396 [409/707] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:58.396 [410/707] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.396 [411/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:01:58.396 [412/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:01:58.396 [413/707] Compiling C object app/dpdk-graph.p/graph_neigh.c.o 00:01:58.396 [414/707] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:01:58.396 [415/707] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:01:58.396 [416/707] Compiling C object lib/librte_node.a.p/node_kernel_rx.c.o 00:01:58.396 [417/707] Compiling C object app/dpdk-graph.p/graph_graph.c.o 00:01:58.396 [418/707] Linking static target lib/librte_sched.a 00:01:58.396 [419/707] Compiling C object lib/librte_node.a.p/node_ip6_lookup.c.o 00:01:58.396 [420/707] Linking static target lib/librte_table.a 00:01:58.396 [421/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:01:58.396 [422/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_parser.c.o 00:01:58.396 [423/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:01:58.396 [424/707] Compiling C object app/dpdk-graph.p/graph_ethdev.c.o 00:01:58.396 [425/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:01:58.396 [426/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:01:58.396 [427/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:01:58.396 [428/707] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:58.396 [429/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_recycle_mbufs_vec_common.c.o 00:01:58.396 [430/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:01:58.396 [431/707] Linking static target lib/librte_cryptodev.a 00:01:58.396 [432/707] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.660 [433/707] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:58.660 [434/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:01:58.660 [435/707] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:01:58.660 
[436/707] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:58.660 [437/707] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o 00:01:58.660 [438/707] Linking static target drivers/librte_bus_pci.a 00:01:58.660 [439/707] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:58.660 [440/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:01:58.660 [441/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:01:58.660 [442/707] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:01:58.660 [443/707] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_main.c.o 00:01:58.660 [444/707] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.660 [445/707] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:01:58.660 [446/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:01:58.660 [447/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ipsec.c.o 00:01:58.660 [448/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_main.c.o 00:01:58.660 [449/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:01:58.660 [450/707] Generating lib/mldev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.660 [451/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:01:58.660 [452/707] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:58.660 [453/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:01:58.660 [454/707] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:58.660 [455/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_common.c.o 00:01:58.920 [456/707] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:01:58.920 [457/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_ordered.c.o 00:01:58.920 [458/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:01:58.920 [459/707] Linking static target lib/librte_ipsec.a 00:01:58.920 [460/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_device_ops.c.o 00:01:58.920 [461/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_stats.c.o 00:01:58.920 [462/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:01:58.920 [463/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_interleave.c.o 00:01:58.920 [464/707] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:01:58.920 [465/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:01:58.920 [466/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:01:58.920 [467/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:01:58.920 [468/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_common.c.o 00:01:58.920 [469/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_options.c.o 00:01:58.920 [470/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:01:58.920 [471/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_ops.c.o 00:01:58.920 [472/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 
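The recurring "Generating lib/X.sym_chk with a custom command" steps are DPDK's symbol-check pass, which verifies that what each built library exports matches its version map. Conceptually it amounts to something like the following sketch (assumed paths and patterns; not DPDK's actual buildtools script):

    # List the symbols the built shared object actually exports...
    nm --dynamic --defined-only build-tmp/lib/librte_acl.so.24.0 \
        | awk '{print $3}' | sort -u > exported.txt
    # ...and the names declared in the library's version map
    grep -oE 'rte_[A-Za-z0-9_]+' lib/acl/version.map | sort -u > declared.txt
    # Any asymmetric difference means a symbol was added or removed
    # without updating the map
    comm -3 exported.txt declared.txt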
00:01:58.920 [473/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:01:58.920 [474/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:01:58.920 [475/707] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:01:58.920 [476/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:01:58.920 [477/707] Linking static target lib/librte_member.a 00:01:58.920 [478/707] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.920 [479/707] Compiling C object lib/librte_node.a.p/node_ip6_rewrite.c.o 00:01:58.920 [480/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:01:58.920 [481/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:01:58.920 [482/707] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.920 [483/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:01:58.920 [484/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:01:58.920 [485/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:01:58.920 [486/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:01:58.920 [487/707] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:01:58.920 [488/707] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_process.c.o 00:01:58.920 [489/707] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:01:58.920 [490/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:01:58.920 [491/707] Linking static target lib/librte_node.a 00:01:58.920 [492/707] Linking static target lib/librte_pdcp.a 00:01:59.180 [493/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:01:59.180 [494/707] Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:59.180 [495/707] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:59.180 [496/707] Linking static target drivers/librte_mempool_ring.a 00:01:59.180 [497/707] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:59.180 [498/707] Linking static target lib/librte_hash.a 00:01:59.180 [499/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:01:59.180 [500/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:01:59.180 [501/707] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_benchmark.c.o 00:01:59.180 [502/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:01:59.180 [503/707] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:01:59.180 [504/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_cman.c.o 00:01:59.180 [505/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:01:59.180 [506/707] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:01:59.180 [507/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:01:59.180 [508/707] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:01:59.180 [509/707] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:01:59.180 [510/707] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:01:59.180 [511/707] Compiling C object 
app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:01:59.180 [512/707] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:01:59.180 [513/707] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.180 [514/707] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.180 [515/707] Linking static target lib/librte_port.a 00:01:59.180 [516/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:01:59.180 [517/707] Compiling C object lib/acl/libavx2_tmp.a.p/acl_run_avx2.c.o 00:01:59.180 [518/707] Linking static target lib/acl/libavx2_tmp.a 00:01:59.180 [519/707] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.180 [520/707] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:01:59.180 [521/707] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.438 [522/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:01:59.438 [523/707] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:01:59.438 [524/707] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:01:59.438 [525/707] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:01:59.438 [526/707] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:01:59.438 [527/707] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:01:59.438 [528/707] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.438 [529/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:01:59.438 [530/707] Compiling C object app/dpdk-testpmd.p/test-pmd_recycle_mbufs.c.o 00:01:59.438 [531/707] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:01:59.438 [532/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:01:59.438 [533/707] Generating lib/pdcp.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.438 [534/707] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:01:59.438 [535/707] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:01:59.439 [536/707] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:01:59.439 [537/707] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:01:59.439 [538/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:01:59.439 [539/707] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:01:59.439 [540/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:01:59.439 [541/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:01:59.439 [542/707] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:01:59.439 [543/707] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:01:59.439 [544/707] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:01:59.439 [545/707] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:01:59.439 [546/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:01:59.439 [547/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:01:59.439 [548/707] Linking static target lib/librte_acl.a 00:01:59.439 [549/707] Linking static target 
lib/librte_eventdev.a 00:01:59.439 [550/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:01:59.439 [551/707] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:01:59.698 [552/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:01:59.698 [553/707] Compiling C object drivers/net/i40e/libi40e_avx2_lib.a.p/i40e_rxtx_vec_avx2.c.o 00:01:59.698 [554/707] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:01:59.698 [555/707] Linking static target drivers/net/i40e/libi40e_avx2_lib.a 00:01:59.698 [556/707] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:01:59.698 [557/707] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:01:59.698 [558/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:01:59.698 [559/707] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.698 [560/707] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:01:59.957 [561/707] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:01:59.957 [562/707] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:01:59.957 [563/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:01:59.957 [564/707] Linking static target drivers/net/i40e/base/libi40e_base.a 00:01:59.957 [565/707] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:01:59.957 [566/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:01:59.957 [567/707] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.957 [568/707] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:01:59.957 [569/707] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:02:00.216 [570/707] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:02:00.216 [571/707] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_common.c.o 00:02:00.216 [572/707] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:00.216 [573/707] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:02:00.474 [574/707] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:00.474 [575/707] Linking static target lib/librte_ethdev.a 00:02:00.474 [576/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:02:00.733 [577/707] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:02:00.992 [578/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:02:00.992 [579/707] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:02:00.992 [580/707] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:02:01.559 [581/707] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:02:01.559 [582/707] Linking static target drivers/libtmp_rte_net_i40e.a 00:02:01.817 [583/707] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:02:01.817 [584/707] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:02:01.817 [585/707] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:02:01.817 [586/707] Compiling C object drivers/librte_net_i40e.so.24.0.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:02:02.076 [587/707] Linking static target 
drivers/librte_net_i40e.a 00:02:02.334 [588/707] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:02.901 [589/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o 00:02:02.901 [590/707] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.901 [591/707] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output) 00:02:03.469 [592/707] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o 00:02:08.739 [593/707] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.739 [594/707] Linking target lib/librte_eal.so.24.0 00:02:08.739 [595/707] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols 00:02:08.739 [596/707] Linking target lib/librte_pci.so.24.0 00:02:08.739 [597/707] Linking target lib/librte_dmadev.so.24.0 00:02:08.739 [598/707] Linking target lib/librte_meter.so.24.0 00:02:08.739 [599/707] Linking target lib/librte_ring.so.24.0 00:02:08.739 [600/707] Linking target lib/librte_timer.so.24.0 00:02:08.739 [601/707] Linking target lib/librte_cfgfile.so.24.0 00:02:08.739 [602/707] Linking target lib/librte_jobstats.so.24.0 00:02:08.739 [603/707] Linking target lib/librte_stack.so.24.0 00:02:08.739 [604/707] Linking target lib/librte_rawdev.so.24.0 00:02:08.739 [605/707] Linking target drivers/librte_bus_vdev.so.24.0 00:02:08.739 [606/707] Linking target lib/librte_acl.so.24.0 00:02:08.739 [607/707] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols 00:02:08.739 [608/707] Generating symbol file lib/librte_acl.so.24.0.p/librte_acl.so.24.0.symbols 00:02:08.739 [609/707] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols 00:02:08.739 [610/707] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols 00:02:08.739 [611/707] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols 00:02:08.739 [612/707] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols 00:02:08.739 [613/707] Generating symbol file drivers/librte_bus_vdev.so.24.0.p/librte_bus_vdev.so.24.0.symbols 00:02:08.739 [614/707] Linking target drivers/librte_bus_pci.so.24.0 00:02:08.739 [615/707] Linking target lib/librte_mempool.so.24.0 00:02:08.739 [616/707] Linking target lib/librte_rcu.so.24.0 00:02:08.739 [617/707] Generating symbol file drivers/librte_bus_pci.so.24.0.p/librte_bus_pci.so.24.0.symbols 00:02:08.739 [618/707] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols 00:02:08.739 [619/707] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols 00:02:08.739 [620/707] Linking target lib/librte_rib.so.24.0 00:02:08.739 [621/707] Linking target drivers/librte_mempool_ring.so.24.0 00:02:08.739 [622/707] Linking target lib/librte_mbuf.so.24.0 00:02:08.998 [623/707] Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols 00:02:08.998 [624/707] Generating symbol file lib/librte_rib.so.24.0.p/librte_rib.so.24.0.symbols 00:02:08.998 [625/707] Linking target lib/librte_net.so.24.0 00:02:08.998 [626/707] Linking target lib/librte_bbdev.so.24.0 00:02:08.998 [627/707] Linking target lib/librte_reorder.so.24.0 00:02:08.998 [628/707] Linking target lib/librte_gpudev.so.24.0 00:02:08.998 [629/707] Linking target lib/librte_regexdev.so.24.0 00:02:08.998 [630/707] Linking target lib/librte_distributor.so.24.0 00:02:08.998 [631/707] Linking target 
lib/librte_compressdev.so.24.0 00:02:08.998 [632/707] Linking target lib/librte_mldev.so.24.0 00:02:08.998 [633/707] Linking target lib/librte_sched.so.24.0 00:02:08.998 [634/707] Linking target lib/librte_cryptodev.so.24.0 00:02:08.998 [635/707] Linking target lib/librte_fib.so.24.0 00:02:08.998 [636/707] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols 00:02:08.998 [637/707] Generating symbol file lib/librte_reorder.so.24.0.p/librte_reorder.so.24.0.symbols 00:02:08.998 [638/707] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols 00:02:08.998 [639/707] Generating symbol file lib/librte_sched.so.24.0.p/librte_sched.so.24.0.symbols 00:02:08.998 [640/707] Linking target lib/librte_hash.so.24.0 00:02:08.998 [641/707] Linking target lib/librte_cmdline.so.24.0 00:02:08.998 [642/707] Linking target lib/librte_security.so.24.0 00:02:09.256 [643/707] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.256 [644/707] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols 00:02:09.256 [645/707] Linking target lib/librte_ethdev.so.24.0 00:02:09.256 [646/707] Generating symbol file lib/librte_security.so.24.0.p/librte_security.so.24.0.symbols 00:02:09.256 [647/707] Linking target lib/librte_efd.so.24.0 00:02:09.256 [648/707] Linking target lib/librte_lpm.so.24.0 00:02:09.256 [649/707] Linking target lib/librte_member.so.24.0 00:02:09.256 [650/707] Linking target lib/librte_ipsec.so.24.0 00:02:09.256 [651/707] Linking target lib/librte_pdcp.so.24.0 00:02:09.256 [652/707] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:02:09.256 [653/707] Linking static target lib/librte_pipeline.a 00:02:09.256 [654/707] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols 00:02:09.516 [655/707] Generating symbol file lib/librte_lpm.so.24.0.p/librte_lpm.so.24.0.symbols 00:02:09.516 [656/707] Linking target lib/librte_pcapng.so.24.0 00:02:09.516 [657/707] Linking target lib/librte_bpf.so.24.0 00:02:09.516 [658/707] Linking target lib/librte_metrics.so.24.0 00:02:09.516 [659/707] Linking target lib/librte_gro.so.24.0 00:02:09.516 [660/707] Linking target lib/librte_power.so.24.0 00:02:09.516 [661/707] Generating symbol file lib/librte_ipsec.so.24.0.p/librte_ipsec.so.24.0.symbols 00:02:09.516 [662/707] Linking target lib/librte_gso.so.24.0 00:02:09.516 [663/707] Linking target lib/librte_ip_frag.so.24.0 00:02:09.516 [664/707] Linking target lib/librte_eventdev.so.24.0 00:02:09.516 [665/707] Linking target drivers/librte_net_i40e.so.24.0 00:02:09.516 [666/707] Generating symbol file lib/librte_pcapng.so.24.0.p/librte_pcapng.so.24.0.symbols 00:02:09.516 [667/707] Generating symbol file lib/librte_bpf.so.24.0.p/librte_bpf.so.24.0.symbols 00:02:09.516 [668/707] Generating symbol file lib/librte_metrics.so.24.0.p/librte_metrics.so.24.0.symbols 00:02:09.516 [669/707] Generating symbol file lib/librte_ip_frag.so.24.0.p/librte_ip_frag.so.24.0.symbols 00:02:09.516 [670/707] Generating symbol file lib/librte_eventdev.so.24.0.p/librte_eventdev.so.24.0.symbols 00:02:09.516 [671/707] Linking target lib/librte_pdump.so.24.0 00:02:09.516 [672/707] Linking target lib/librte_graph.so.24.0 00:02:09.516 [673/707] Linking target lib/librte_latencystats.so.24.0 00:02:09.516 [674/707] Linking target lib/librte_bitratestats.so.24.0 00:02:09.516 [675/707] Linking target lib/librte_dispatcher.so.24.0 00:02:09.516 [676/707] Linking target lib/librte_port.so.24.0 
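The "Generating symbol file ... .symbols" entries interleaved here are Meson's relink optimization: after linking a shared library it records the exported-symbol list, and on incremental rebuilds dependents are relinked only if that list changed. A rough, illustrative equivalent of what such a file captures:

    # Rough equivalent of a Meson .symbols dump (illustrative, not Meson's tool)
    readelf --dyn-syms --wide build-tmp/lib/librte_port.so.24.0 \
        | grep ' GLOBAL ' > librte_port.so.24.0.symbols
    # On the next build, an unchanged symbol list lets ninja skip relinking
    # everything that links against librte_port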
00:02:09.775 [677/707] Generating symbol file lib/librte_graph.so.24.0.p/librte_graph.so.24.0.symbols 00:02:09.775 [678/707] Generating symbol file lib/librte_port.so.24.0.p/librte_port.so.24.0.symbols 00:02:09.775 [679/707] Linking target lib/librte_node.so.24.0 00:02:09.775 [680/707] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:09.775 [681/707] Linking target lib/librte_table.so.24.0 00:02:09.775 [682/707] Linking static target lib/librte_vhost.a 00:02:10.034 [683/707] Generating symbol file lib/librte_table.so.24.0.p/librte_table.so.24.0.symbols 00:02:10.294 [684/707] Linking target app/dpdk-pdump 00:02:10.294 [685/707] Linking target app/dpdk-test-acl 00:02:10.294 [686/707] Linking target app/dpdk-test-dma-perf 00:02:10.294 [687/707] Linking target app/dpdk-test-sad 00:02:10.294 [688/707] Linking target app/dpdk-test-cmdline 00:02:10.294 [689/707] Linking target app/dpdk-test-crypto-perf 00:02:10.294 [690/707] Linking target app/dpdk-test-gpudev 00:02:10.294 [691/707] Linking target app/dpdk-dumpcap 00:02:10.294 [692/707] Linking target app/dpdk-test-flow-perf 00:02:10.294 [693/707] Linking target app/dpdk-graph 00:02:10.294 [694/707] Linking target app/dpdk-test-compress-perf 00:02:10.294 [695/707] Linking target app/dpdk-proc-info 00:02:10.294 [696/707] Linking target app/dpdk-test-security-perf 00:02:10.294 [697/707] Linking target app/dpdk-test-mldev 00:02:10.294 [698/707] Linking target app/dpdk-test-regex 00:02:10.294 [699/707] Linking target app/dpdk-test-pipeline 00:02:10.294 [700/707] Linking target app/dpdk-test-eventdev 00:02:10.294 [701/707] Linking target app/dpdk-test-fib 00:02:10.294 [702/707] Linking target app/dpdk-test-bbdev 00:02:10.294 [703/707] Linking target app/dpdk-testpmd 00:02:12.200 [704/707] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:12.200 [705/707] Linking target lib/librte_vhost.so.24.0 00:02:15.488 [706/707] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:15.488 [707/707] Linking target lib/librte_pipeline.so.24.0 00:02:15.488 10:33:31 -- common/autobuild_common.sh@187 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 install 00:02:15.488 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:02:15.488 [0/1] Installing files. 
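The second ninja invocation above runs the `install` target, which copies the built libraries, headers, pkg-config metadata, and the example sources listed below into the configured prefix (here the in-tree dpdk/build directory). Two common variations, shown as a sketch of standard Meson/ninja behavior rather than anything this script does:

    # Redirect the install root without reconfiguring (standard DESTDIR handling)
    DESTDIR=/tmp/dpdk-stage ninja -C build-tmp install
    # After a system-wide install, applications typically compile against DPDK
    # via pkg-config (illustrative):
    cc -O3 app.c $(pkg-config --cflags --libs libdpdk) -o app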
00:02:15.488 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples 00:02:15.488 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.488 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:15.489 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:15.489 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 
00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:15.490 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:15.491 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:02:15.491 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:15.491 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:15.491 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:15.492 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 
00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:15.492 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.493 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec_sa.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:15.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:15.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:15.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:15.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:15.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:15.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool 00:02:15.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:15.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:15.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:15.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:15.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:15.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:15.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:15.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:15.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:15.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:15.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:15.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:15.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:15.494 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:15.494 Installing lib/librte_log.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_log.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_kvargs.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_telemetry.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_eal.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_rcu.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_mempool.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_mbuf.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_net.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_meter.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_ethdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_cmdline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_metrics.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_hash.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_hash.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_timer.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_acl.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_bbdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_bitratestats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_bpf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_cfgfile.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_compressdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_cryptodev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 
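
Each library above is staged in two forms, a static archive (librte_*.a) and a versioned shared object (librte_*.so.24.0, the ABI level that DPDK 23.11 ships), both under dpdk/build/lib. Consumers are not expected to hard-code these paths: the staged tree also carries pkg-config metadata, so an out-of-tree application would typically be built with something like cc app.c $(pkg-config --cflags --libs libdpdk), with PKG_CONFIG_PATH pointed at the staged pkgconfig directory (the exact location depends on the meson prefix used for this build).
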
Installing lib/librte_distributor.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_dmadev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_efd.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_eventdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_dispatcher.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_dispatcher.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_gpudev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_gro.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_gso.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_gso.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_ip_frag.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_jobstats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.494 Installing lib/librte_latencystats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_lpm.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_member.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_pcapng.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_power.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_rawdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_regexdev.a to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_regexdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_mldev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_mldev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_rib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_reorder.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_sched.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_security.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_stack.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_vhost.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_ipsec.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_pdcp.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_pdcp.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_fib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_port.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_pdump.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_table.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_pipeline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_graph.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_node.a 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing lib/librte_node.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing drivers/librte_bus_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:02:15.756 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing drivers/librte_bus_vdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:02:15.756 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing drivers/librte_mempool_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:02:15.756 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:15.756 Installing drivers/librte_net_i40e.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0 00:02:15.756 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:15.756 Installing app/dpdk-graph to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:15.756 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:15.756 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:15.756 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:15.756 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:15.756 Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:15.756 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:15.756 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:15.756 Installing app/dpdk-test-dma-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:15.757 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:15.757 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:15.757 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:15.757 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:15.757 Installing app/dpdk-test-mldev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:15.757 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:15.757 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:15.757 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:15.757 Installing app/dpdk-test-sad to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:15.757 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
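
Note the split visible above: the static driver archives land in build/lib, while the PMD shared objects go to lib/dpdk/pmds-24.0, the plugin directory a shared-library EAL scans for drivers at startup. The tools staged into build/bin can be smoke-tested directly, for instance with ./build/bin/dpdk-testpmd -l 0-1 -n 4 -- -i (two lcores, interactive testpmd prompt), assuming hugepages are configured and a port is bound to a DPDK-compatible driver. The rte_config.h entry that follows begins the public header staging into build/include; a minimal sketch of a consumer of those headers, using only the standard rte_eal_init()/rte_eal_cleanup() entry points and the pkg-config flags noted earlier, could look like:

#include <stdio.h>
#include <rte_eal.h>
#include <rte_lcore.h>

int main(int argc, char **argv)
{
    /* rte_eal_init() consumes the EAL arguments (cores, hugepages, ...) */
    int ret = rte_eal_init(argc, argv);
    if (ret < 0) {
        fprintf(stderr, "EAL init failed\n");
        return 1;
    }
    printf("EAL up, %u lcore(s) available\n", rte_lcore_count());
    rte_eal_cleanup();   /* release hugepage and other EAL resources */
    return 0;
}
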
00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/log/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 
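
One detail worth noting in the two stretches above: the same header names are installed once from lib/eal/include/generic/ (into build/include/generic) and once from lib/eal/x86/include/ (into build/include), because the generic headers define the portable API while the per-architecture copies supply the x86 implementations. rte_spinlock.h is a typical member of that pair; a minimal, illustrative-only sketch of its use:

#include <rte_spinlock.h>

static rte_spinlock_t lock = RTE_SPINLOCK_INITIALIZER;
static unsigned long counter;

/* Serialize a counter update across lcores using the staged spinlock API. */
static void
bump_counter(void)
{
    rte_spinlock_lock(&lock);
    counter++;
    rte_spinlock_unlock(&lock);
}
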
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lock_annotations.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.757 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_stdatomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_dtls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 
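
The lib/net headers staged above (rte_ip.h, rte_tcp.h, rte_ether.h, and the rest) are plain protocol definitions with no device dependencies. As an illustration, a sketch that tests whether an mbuf holds an IPv4 frame using only these definitions, assuming an untagged Ethernet header at the start of the buffer:

#include <rte_mbuf.h>
#include <rte_ether.h>
#include <rte_byteorder.h>

/* Peek at the Ethernet header of an mbuf and test its ethertype. */
static int
is_ipv4_frame(struct rte_mbuf *m)
{
    const struct rte_ether_hdr *eth =
        rte_pktmbuf_mtod(m, const struct rte_ether_hdr *);
    return eth->ether_type == rte_cpu_to_be_16(RTE_ETHER_TYPE_IPV4);
}
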
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_pdcp_hdr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.758 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_dma_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dispatcher/rte_dispatcher.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.759 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.760 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_mcore_dispatch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_rtc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip6_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_udp4_input_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/dpdk-cmdline-gen.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-rss-flows.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:15.761 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:15.761 Installing symlink pointing to librte_log.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so.24 00:02:15.761 Installing symlink pointing to librte_log.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so 00:02:15.761 Installing symlink pointing to librte_kvargs.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.24 00:02:15.761 Installing symlink pointing to librte_kvargs.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:02:15.761 Installing symlink pointing to librte_telemetry.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.24 00:02:15.761 Installing symlink pointing to librte_telemetry.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:02:15.761 Installing symlink pointing to librte_eal.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.24 00:02:15.761 Installing symlink pointing to librte_eal.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:02:15.761 Installing symlink pointing to librte_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.24 00:02:15.761 Installing symlink pointing to librte_ring.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:02:15.761 Installing symlink pointing to librte_rcu.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.24 00:02:15.761 Installing symlink pointing to 
librte_rcu.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:02:15.761 Installing symlink pointing to librte_mempool.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.24 00:02:15.761 Installing symlink pointing to librte_mempool.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:02:15.761 Installing symlink pointing to librte_mbuf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.24 00:02:15.761 Installing symlink pointing to librte_mbuf.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:02:15.761 Installing symlink pointing to librte_net.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.24 00:02:15.761 Installing symlink pointing to librte_net.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:02:15.761 Installing symlink pointing to librte_meter.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.24 00:02:15.761 Installing symlink pointing to librte_meter.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:02:15.761 Installing symlink pointing to librte_ethdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.24 00:02:15.761 Installing symlink pointing to librte_ethdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:02:15.761 Installing symlink pointing to librte_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.24 00:02:15.761 Installing symlink pointing to librte_pci.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:02:15.761 Installing symlink pointing to librte_cmdline.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.24 00:02:15.762 Installing symlink pointing to librte_cmdline.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:02:15.762 Installing symlink pointing to librte_metrics.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.24 00:02:15.762 Installing symlink pointing to librte_metrics.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:02:15.762 Installing symlink pointing to librte_hash.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.24 00:02:15.762 Installing symlink pointing to librte_hash.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:02:15.762 Installing symlink pointing to librte_timer.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.24 00:02:15.762 Installing symlink pointing to librte_timer.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:02:15.762 Installing symlink pointing to librte_acl.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.24 00:02:15.762 Installing symlink pointing to librte_acl.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:02:15.762 Installing symlink pointing to librte_bbdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.24 00:02:15.762 Installing symlink pointing to librte_bbdev.so.24 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:02:15.762 Installing symlink pointing to librte_bitratestats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.24 00:02:15.762 Installing symlink pointing to librte_bitratestats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:02:15.762 Installing symlink pointing to librte_bpf.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.24 00:02:15.762 Installing symlink pointing to librte_bpf.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:02:15.762 Installing symlink pointing to librte_cfgfile.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.24 00:02:15.762 Installing symlink pointing to librte_cfgfile.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:02:15.762 Installing symlink pointing to librte_compressdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.24 00:02:15.762 Installing symlink pointing to librte_compressdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:02:15.762 Installing symlink pointing to librte_cryptodev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.24 00:02:15.762 Installing symlink pointing to librte_cryptodev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:02:15.762 Installing symlink pointing to librte_distributor.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.24 00:02:15.762 Installing symlink pointing to librte_distributor.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:02:15.762 Installing symlink pointing to librte_dmadev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.24 00:02:15.762 Installing symlink pointing to librte_dmadev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:02:15.762 Installing symlink pointing to librte_efd.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.24 00:02:15.762 Installing symlink pointing to librte_efd.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:02:15.762 Installing symlink pointing to librte_eventdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.24 00:02:15.762 Installing symlink pointing to librte_eventdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:02:15.762 Installing symlink pointing to librte_dispatcher.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so.24 00:02:15.762 Installing symlink pointing to librte_dispatcher.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so 00:02:15.762 Installing symlink pointing to librte_gpudev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.24 00:02:15.762 Installing symlink pointing to librte_gpudev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:02:15.762 Installing symlink pointing to librte_gro.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.24 00:02:15.762 Installing symlink pointing to librte_gro.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:02:15.762 Installing symlink pointing to librte_gso.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.24 00:02:15.762 Installing symlink pointing to librte_gso.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:02:15.762 Installing symlink pointing to librte_ip_frag.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.24 00:02:15.762 Installing symlink pointing to librte_ip_frag.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:02:15.762 Installing symlink pointing to librte_jobstats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.24 00:02:15.762 Installing symlink pointing to librte_jobstats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:02:15.762 Installing symlink pointing to librte_latencystats.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.24 00:02:15.762 Installing symlink pointing to librte_latencystats.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:02:15.762 Installing symlink pointing to librte_lpm.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.24 00:02:15.762 Installing symlink pointing to librte_lpm.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:02:15.762 Installing symlink pointing to librte_member.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.24 00:02:15.762 Installing symlink pointing to librte_member.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:02:15.762 Installing symlink pointing to librte_pcapng.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.24 00:02:15.762 Installing symlink pointing to librte_pcapng.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:02:15.762 Installing symlink pointing to librte_power.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.24 00:02:15.762 Installing symlink pointing to librte_power.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:02:15.762 Installing symlink pointing to librte_rawdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.24 00:02:15.762 Installing symlink pointing to librte_rawdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:02:15.762 Installing symlink pointing to librte_regexdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.24 00:02:15.762 Installing symlink pointing to librte_regexdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:02:15.762 Installing symlink pointing to librte_mldev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so.24 00:02:15.762 Installing symlink pointing to librte_mldev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so 00:02:15.762 Installing symlink pointing to 
librte_rib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.24 00:02:15.762 Installing symlink pointing to librte_rib.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:02:15.762 Installing symlink pointing to librte_reorder.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.24 00:02:15.762 Installing symlink pointing to librte_reorder.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:02:15.762 Installing symlink pointing to librte_sched.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.24 00:02:15.762 Installing symlink pointing to librte_sched.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:02:15.762 Installing symlink pointing to librte_security.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.24 00:02:15.762 Installing symlink pointing to librte_security.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:02:15.762 Installing symlink pointing to librte_stack.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.24 00:02:15.762 Installing symlink pointing to librte_stack.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:02:15.762 Installing symlink pointing to librte_vhost.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.24 00:02:15.762 Installing symlink pointing to librte_vhost.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:02:15.762 Installing symlink pointing to librte_ipsec.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.24 00:02:15.762 Installing symlink pointing to librte_ipsec.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:02:15.762 Installing symlink pointing to librte_pdcp.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so.24 00:02:15.762 Installing symlink pointing to librte_pdcp.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so 00:02:15.762 Installing symlink pointing to librte_fib.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.24 00:02:15.762 Installing symlink pointing to librte_fib.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:02:15.762 Installing symlink pointing to librte_port.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.24 00:02:15.762 Installing symlink pointing to librte_port.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:02:15.762 Installing symlink pointing to librte_pdump.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.24 00:02:15.762 Installing symlink pointing to librte_pdump.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so 00:02:15.762 Installing symlink pointing to librte_table.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.24 00:02:15.762 Installing symlink pointing to librte_table.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so 00:02:15.763 Installing symlink pointing to librte_pipeline.so.24.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.24
00:02:15.763 Installing symlink pointing to librte_pipeline.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so
00:02:15.763 Installing symlink pointing to librte_graph.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.24
00:02:15.763 Installing symlink pointing to librte_graph.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so
00:02:15.763 Installing symlink pointing to librte_node.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.24
00:02:15.763 Installing symlink pointing to librte_node.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so
00:02:15.763 './librte_bus_pci.so' -> 'dpdk/pmds-24.0/librte_bus_pci.so'
00:02:15.763 './librte_bus_pci.so.24' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24'
00:02:15.763 './librte_bus_pci.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24.0'
00:02:15.763 './librte_bus_vdev.so' -> 'dpdk/pmds-24.0/librte_bus_vdev.so'
00:02:15.763 './librte_bus_vdev.so.24' -> 'dpdk/pmds-24.0/librte_bus_vdev.so.24'
00:02:15.763 './librte_bus_vdev.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_vdev.so.24.0'
00:02:15.763 './librte_mempool_ring.so' -> 'dpdk/pmds-24.0/librte_mempool_ring.so'
00:02:15.763 './librte_mempool_ring.so.24' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24'
00:02:15.763 './librte_mempool_ring.so.24.0' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24.0'
00:02:15.763 './librte_net_i40e.so' -> 'dpdk/pmds-24.0/librte_net_i40e.so'
00:02:15.763 './librte_net_i40e.so.24' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24'
00:02:15.763 './librte_net_i40e.so.24.0' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24.0'
00:02:15.763 Installing symlink pointing to librte_bus_pci.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so.24
00:02:15.763 Installing symlink pointing to librte_bus_pci.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so
00:02:15.763 Installing symlink pointing to librte_bus_vdev.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so.24
00:02:15.763 Installing symlink pointing to librte_bus_vdev.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so
00:02:15.763 Installing symlink pointing to librte_mempool_ring.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so.24
00:02:15.763 Installing symlink pointing to librte_mempool_ring.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so
00:02:15.763 Installing symlink pointing to librte_net_i40e.so.24.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so.24
00:02:15.763 Installing symlink pointing to librte_net_i40e.so.24 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so
00:02:15.763 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-24.0'
00:02:15.763 10:33:32 -- common/autobuild_common.sh@189 -- $ uname -s
00:02:16.022 10:33:32 -- common/autobuild_common.sh@189 -- $ [[ Linux == \F\r\e\e\B\S\D ]]
00:02:16.022 10:33:32 -- common/autobuild_common.sh@200 -- $ cat
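The install pass above leaves a complete DPDK staging tree: headers under build/include, versioned libraries and their .so symlinks under build/lib, drivers relocated into the pmds-24.0 plugin directory, and pkg-config metadata under build/lib/pkgconfig. A minimal sketch of how a consumer resolves that tree, assuming nothing beyond the paths shown in this log (the expected outputs in the comments are assumptions, not captured output):

  # Point pkg-config at the staged tree installed above.
  export PKG_CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig
  pkg-config --modversion libdpdk      # should report the DPDK release staged here
  pkg-config --cflags --libs libdpdk   # -I.../dpdk/build/include plus the librte_* link line

This is the same resolution the SPDK configure step below performs when it reports "Using .../dpdk/build/lib/pkgconfig for additional libs...".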
00:02:16.022 10:33:32 -- common/autobuild_common.sh@205 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:02:16.022
00:02:16.022 real 0m26.959s
00:02:16.022 user 7m59.174s
00:02:16.022 sys 2m26.737s
00:02:16.022 10:33:32 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:02:16.022 10:33:32 -- common/autotest_common.sh@10 -- $ set +x
00:02:16.022 ************************************
00:02:16.022 END TEST build_native_dpdk
00:02:16.022 ************************************
00:02:16.022 10:33:32 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:02:16.022 10:33:32 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:02:16.022 10:33:32 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]]
00:02:16.022 10:33:32 -- spdk/autobuild.sh@52 -- $ llvm_precompile
00:02:16.022 10:33:32 -- common/autobuild_common.sh@423 -- $ run_test autobuild_llvm_precompile _llvm_precompile
00:02:16.022 10:33:32 -- common/autotest_common.sh@1077 -- $ '[' 2 -le 1 ']'
00:02:16.022 10:33:32 -- common/autotest_common.sh@1083 -- $ xtrace_disable
00:02:16.022 10:33:32 -- common/autotest_common.sh@10 -- $ set +x
00:02:16.022 ************************************
00:02:16.022 START TEST autobuild_llvm_precompile
00:02:16.022 ************************************
00:02:16.022 10:33:32 -- common/autotest_common.sh@1104 -- $ _llvm_precompile
00:02:16.022 10:33:32 -- common/autobuild_common.sh@32 -- $ clang --version
00:02:16.022 10:33:32 -- common/autobuild_common.sh@32 -- $ [[ clang version 16.0.6 (Fedora 16.0.6-3.fc38)
00:02:16.022 Target: x86_64-redhat-linux-gnu
00:02:16.022 Thread model: posix
00:02:16.022 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]]
00:02:16.022 10:33:32 -- common/autobuild_common.sh@33 -- $ clang_num=16
00:02:16.022 10:33:32 -- common/autobuild_common.sh@35 -- $ export CC=clang-16
00:02:16.022 10:33:32 -- common/autobuild_common.sh@35 -- $ CC=clang-16
00:02:16.022 10:33:32 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-16
00:02:16.022 10:33:32 -- common/autobuild_common.sh@36 -- $ CXX=clang++-16
00:02:16.022 10:33:32 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a)
00:02:16.022 10:33:32 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:02:16.022 10:33:32 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a ]]
00:02:16.022 10:33:32 -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a'
00:02:16.022 10:33:32 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:02:16.280 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs...
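The xtrace above shows autobuild_common.sh deriving the toolchain from the installed clang. A condensed, hedged restatement of that probe (illustrative only; the authoritative logic is the script being traced):

  # Match "version X.Y.Z" in `clang --version` and reuse the major version,
  # as the trace above does (clang_num=16 on this runner).
  if [[ "$(clang --version)" =~ version\ (([0-9]+)\.([0-9]+)\.([0-9]+)) ]]; then
      clang_num=${BASH_REMATCH[2]}
      export CC=clang-$clang_num CXX=clang++-$clang_num
      fuzzer_lib=/usr/lib64/clang/$clang_num/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
      [[ -e $fuzzer_lib ]] || echo "fuzzer runtime not found" >&2
  fi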
00:02:16.280 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:16.280 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:16.280 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:02:16.849 Using 'verbs' RDMA provider
00:02:32.296 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done.
00:02:44.626 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done.
00:02:44.886 Creating mk/config.mk...done.
00:02:44.886 Creating mk/cc.flags.mk...done.
00:02:44.886 Type 'make' to build.
00:02:44.886
00:02:44.886 real 0m28.952s
00:02:44.886 user 0m12.388s
00:02:44.886 sys 0m15.890s
00:02:44.886 10:34:01 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:02:44.886 10:34:01 -- common/autotest_common.sh@10 -- $ set +x
00:02:44.886 ************************************
00:02:44.886 END TEST autobuild_llvm_precompile
00:02:44.886 ************************************
00:02:44.886 10:34:01 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:02:44.886 10:34:01 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:02:44.886 10:34:01 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:02:44.886 10:34:01 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]]
00:02:44.886 10:34:01 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:02:45.146 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs...
00:02:45.146 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:45.146 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:45.405 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:02:45.663 Using 'verbs' RDMA provider
00:02:58.443 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done.
00:03:10.666 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done.
00:03:10.666 Creating mk/config.mk...done.
00:03:10.666 Creating mk/cc.flags.mk...done.
00:03:10.666 Type 'make' to build.
00:03:10.666 10:34:25 -- spdk/autobuild.sh@69 -- $ run_test make make -j112
00:03:10.666 10:34:25 -- common/autotest_common.sh@1077 -- $ '[' 3 -le 1 ']'
00:03:10.666 10:34:25 -- common/autotest_common.sh@1083 -- $ xtrace_disable
00:03:10.666 10:34:25 -- common/autotest_common.sh@10 -- $ set +x
00:03:10.666 ************************************
00:03:10.666 START TEST make
00:03:10.666 ************************************
00:03:10.666 10:34:25 -- common/autotest_common.sh@1104 -- $ make -j112
00:03:10.666 make[1]: Nothing to be done for 'all'.
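For reference, the whole cycle recorded above condenses to a configure-then-make sequence. A hedged local replay (the flag list here is abbreviated to the options central to this job; the full set appears verbatim in the trace):

  cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  CC=clang-16 CXX=clang++-16 ./configure --enable-debug --enable-werror \
      --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build \
      --with-vfio-user \
      --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
  make -j112   # the job passes -j112 through run_test, as shown above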
00:03:11.601 The Meson build system
00:03:11.601 Version: 1.3.1
00:03:11.601 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user
00:03:11.601 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:03:11.601 Build type: native build
00:03:11.601 Project name: libvfio-user
00:03:11.601 Project version: 0.0.1
00:03:11.601 C compiler for the host machine: clang-16 (clang 16.0.6 "clang version 16.0.6 (Fedora 16.0.6-3.fc38)")
00:03:11.601 C linker for the host machine: clang-16 ld.bfd 2.39-16
00:03:11.601 Host machine cpu family: x86_64
00:03:11.601 Host machine cpu: x86_64
00:03:11.601 Run-time dependency threads found: YES
00:03:11.601 Library dl found: YES
00:03:11.601 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:03:11.601 Run-time dependency json-c found: YES 0.17
00:03:11.601 Run-time dependency cmocka found: YES 1.1.7
00:03:11.601 Program pytest-3 found: NO
00:03:11.601 Program flake8 found: NO
00:03:11.601 Program misspell-fixer found: NO
00:03:11.601 Program restructuredtext-lint found: NO
00:03:11.601 Program valgrind found: YES (/usr/bin/valgrind)
00:03:11.601 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:03:11.601 Compiler for C supports arguments -Wmissing-declarations: YES
00:03:11.601 Compiler for C supports arguments -Wwrite-strings: YES
00:03:11.601 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:03:11.601 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh)
00:03:11.601 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh)
00:03:11.601 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
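The block above is a stock Meson configure pass for the bundled libvfio-user. A hedged manual equivalent, with option values taken from the "User defined options" summary just below (paths follow this workspace; this is an illustration, not the job's script):

  SRC=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user
  BUILD=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
  CC=clang-16 meson setup "$BUILD" "$SRC" \
      -Dbuildtype=debug -Ddefault_library=static -Dlibdir=/usr/local/lib
  ninja -C "$BUILD"
  # The job then stages the result much as logged below:
  DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user \
      meson install --quiet -C "$BUILD"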
00:03:11.601 Build targets in project: 8
00:03:11.601 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions:
00:03:11.601 * 0.57.0: {'exclude_suites arg in add_test_setup'}
00:03:11.601
00:03:11.601 libvfio-user 0.0.1
00:03:11.601
00:03:11.601 User defined options
00:03:11.601 buildtype : debug
00:03:11.601 default_library: static
00:03:11.601 libdir : /usr/local/lib
00:03:11.601
00:03:11.601 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:03:11.860 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug'
00:03:11.860 [1/36] Compiling C object samples/lspci.p/lspci.c.o
00:03:11.860 [2/36] Compiling C object lib/libvfio-user.a.p/irq.c.o
00:03:11.860 [3/36] Compiling C object samples/null.p/null.c.o
00:03:11.860 [4/36] Compiling C object lib/libvfio-user.a.p/tran.c.o
00:03:11.860 [5/36] Compiling C object samples/client.p/.._lib_tran.c.o
00:03:11.860 [6/36] Compiling C object lib/libvfio-user.a.p/migration.c.o
00:03:11.860 [7/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o
00:03:11.860 [8/36] Compiling C object samples/client.p/.._lib_migration.c.o
00:03:11.860 [9/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o
00:03:11.860 [10/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o
00:03:11.860 [11/36] Compiling C object lib/libvfio-user.a.p/pci.c.o
00:03:11.860 [12/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o
00:03:11.860 [13/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o
00:03:11.860 [14/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o
00:03:11.860 [15/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o
00:03:11.860 [16/36] Compiling C object lib/libvfio-user.a.p/dma.c.o
00:03:11.860 [17/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o
00:03:11.860 [18/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o
00:03:11.860 [19/36] Compiling C object test/unit_tests.p/mocks.c.o
00:03:11.860 [20/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o
00:03:11.860 [21/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o
00:03:11.860 [22/36] Compiling C object samples/server.p/server.c.o
00:03:11.860 [23/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o
00:03:11.860 [24/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o
00:03:11.860 [25/36] Compiling C object test/unit_tests.p/unit-tests.c.o
00:03:11.860 [26/36] Compiling C object samples/client.p/client.c.o
00:03:11.860 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o
00:03:11.860 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o
00:03:11.860 [29/36] Linking static target lib/libvfio-user.a
00:03:11.860 [30/36] Linking target samples/client
00:03:11.860 [31/36] Linking target test/unit_tests
00:03:12.120 [32/36] Linking target samples/server
00:03:12.120 [33/36] Linking target samples/null
00:03:12.120 [34/36] Linking target samples/shadow_ioeventfd_server
00:03:12.120 [35/36] Linking target samples/lspci
00:03:12.120 [36/36] Linking target samples/gpio-pci-idio-16
00:03:12.120 INFO: autodetecting backend as ninja
00:03:12.120 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:03:12.120 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:12.378 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:12.378 ninja: no work to do. 00:03:15.669 CC lib/log/log.o 00:03:15.669 CC lib/log/log_flags.o 00:03:15.669 CC lib/log/log_deprecated.o 00:03:15.669 CC lib/ut/ut.o 00:03:15.669 CC lib/ut_mock/mock.o 00:03:15.669 LIB libspdk_log.a 00:03:15.669 LIB libspdk_ut_mock.a 00:03:15.669 LIB libspdk_ut.a 00:03:15.929 CC lib/ioat/ioat.o 00:03:15.929 CC lib/util/base64.o 00:03:15.929 CC lib/util/bit_array.o 00:03:15.929 CC lib/util/crc16.o 00:03:15.929 CC lib/util/crc32.o 00:03:15.929 CC lib/util/cpuset.o 00:03:15.929 CC lib/dma/dma.o 00:03:15.929 CC lib/util/crc32c.o 00:03:15.929 CC lib/util/crc32_ieee.o 00:03:15.929 CC lib/util/crc64.o 00:03:15.929 CC lib/util/dif.o 00:03:15.929 CC lib/util/fd.o 00:03:15.929 CC lib/util/file.o 00:03:15.929 CC lib/util/hexlify.o 00:03:15.929 CC lib/util/iov.o 00:03:15.929 CC lib/util/math.o 00:03:15.929 CC lib/util/pipe.o 00:03:15.929 CC lib/util/strerror_tls.o 00:03:15.929 CC lib/util/string.o 00:03:15.929 CC lib/util/xor.o 00:03:15.929 CC lib/util/uuid.o 00:03:15.929 CC lib/util/fd_group.o 00:03:15.929 CC lib/util/zipf.o 00:03:15.929 CXX lib/trace_parser/trace.o 00:03:16.188 LIB libspdk_dma.a 00:03:16.188 CC lib/vfio_user/host/vfio_user_pci.o 00:03:16.188 CC lib/vfio_user/host/vfio_user.o 00:03:16.188 LIB libspdk_ioat.a 00:03:16.188 LIB libspdk_util.a 00:03:16.188 LIB libspdk_vfio_user.a 00:03:16.448 LIB libspdk_trace_parser.a 00:03:16.448 CC lib/rdma/common.o 00:03:16.448 CC lib/rdma/rdma_verbs.o 00:03:16.448 CC lib/json/json_parse.o 00:03:16.448 CC lib/json/json_util.o 00:03:16.448 CC lib/json/json_write.o 00:03:16.448 CC lib/env_dpdk/memory.o 00:03:16.448 CC lib/idxd/idxd.o 00:03:16.448 CC lib/idxd/idxd_kernel.o 00:03:16.448 CC lib/env_dpdk/env.o 00:03:16.448 CC lib/idxd/idxd_user.o 00:03:16.448 CC lib/env_dpdk/pci.o 00:03:16.706 CC lib/env_dpdk/init.o 00:03:16.706 CC lib/env_dpdk/threads.o 00:03:16.706 CC lib/env_dpdk/pci_ioat.o 00:03:16.706 CC lib/env_dpdk/pci_virtio.o 00:03:16.706 CC lib/env_dpdk/pci_vmd.o 00:03:16.706 CC lib/env_dpdk/pci_idxd.o 00:03:16.707 CC lib/env_dpdk/pci_event.o 00:03:16.707 CC lib/env_dpdk/sigbus_handler.o 00:03:16.707 CC lib/env_dpdk/pci_dpdk.o 00:03:16.707 CC lib/env_dpdk/pci_dpdk_2211.o 00:03:16.707 CC lib/env_dpdk/pci_dpdk_2207.o 00:03:16.707 CC lib/vmd/vmd.o 00:03:16.707 CC lib/vmd/led.o 00:03:16.707 CC lib/conf/conf.o 00:03:16.707 LIB libspdk_rdma.a 00:03:16.707 LIB libspdk_conf.a 00:03:16.707 LIB libspdk_json.a 00:03:16.964 LIB libspdk_idxd.a 00:03:16.964 LIB libspdk_vmd.a 00:03:16.964 CC lib/jsonrpc/jsonrpc_server.o 00:03:16.964 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:03:16.964 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:03:16.964 CC lib/jsonrpc/jsonrpc_client.o 00:03:17.221 LIB libspdk_jsonrpc.a 00:03:17.480 LIB libspdk_env_dpdk.a 00:03:17.480 CC lib/rpc/rpc.o 00:03:17.738 LIB libspdk_rpc.a 00:03:17.996 CC lib/sock/sock.o 00:03:17.996 CC lib/sock/sock_rpc.o 00:03:17.996 CC lib/notify/notify.o 00:03:17.996 CC lib/trace/trace_rpc.o 00:03:17.996 CC lib/trace/trace.o 00:03:17.996 CC lib/notify/notify_rpc.o 00:03:17.996 CC lib/trace/trace_flags.o 00:03:18.255 LIB libspdk_notify.a 00:03:18.255 LIB libspdk_trace.a 00:03:18.255 LIB libspdk_sock.a 00:03:18.513 CC lib/thread/thread.o 00:03:18.513 CC lib/thread/iobuf.o 00:03:18.513 CC lib/nvme/nvme_ctrlr_cmd.o 00:03:18.513 CC lib/nvme/nvme_ctrlr.o 00:03:18.513 CC 
lib/nvme/nvme_fabric.o 00:03:18.513 CC lib/nvme/nvme_ns.o 00:03:18.513 CC lib/nvme/nvme_ns_cmd.o 00:03:18.513 CC lib/nvme/nvme_qpair.o 00:03:18.513 CC lib/nvme/nvme_pcie_common.o 00:03:18.513 CC lib/nvme/nvme_pcie.o 00:03:18.513 CC lib/nvme/nvme_quirks.o 00:03:18.513 CC lib/nvme/nvme_transport.o 00:03:18.513 CC lib/nvme/nvme.o 00:03:18.513 CC lib/nvme/nvme_discovery.o 00:03:18.513 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:03:18.513 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:03:18.513 CC lib/nvme/nvme_tcp.o 00:03:18.513 CC lib/nvme/nvme_opal.o 00:03:18.513 CC lib/nvme/nvme_io_msg.o 00:03:18.513 CC lib/nvme/nvme_poll_group.o 00:03:18.513 CC lib/nvme/nvme_zns.o 00:03:18.513 CC lib/nvme/nvme_cuse.o 00:03:18.513 CC lib/nvme/nvme_vfio_user.o 00:03:18.513 CC lib/nvme/nvme_rdma.o 00:03:19.447 LIB libspdk_thread.a 00:03:19.447 CC lib/vfu_tgt/tgt_endpoint.o 00:03:19.447 CC lib/vfu_tgt/tgt_rpc.o 00:03:19.705 CC lib/accel/accel.o 00:03:19.705 CC lib/accel/accel_rpc.o 00:03:19.705 CC lib/accel/accel_sw.o 00:03:19.705 CC lib/virtio/virtio_vfio_user.o 00:03:19.705 CC lib/virtio/virtio.o 00:03:19.705 CC lib/virtio/virtio_pci.o 00:03:19.705 CC lib/virtio/virtio_vhost_user.o 00:03:19.705 CC lib/init/json_config.o 00:03:19.705 CC lib/init/rpc.o 00:03:19.705 CC lib/init/subsystem.o 00:03:19.705 CC lib/init/subsystem_rpc.o 00:03:19.705 CC lib/blob/blobstore.o 00:03:19.705 CC lib/blob/blob_bs_dev.o 00:03:19.705 CC lib/blob/request.o 00:03:19.705 CC lib/blob/zeroes.o 00:03:19.705 LIB libspdk_init.a 00:03:19.705 LIB libspdk_vfu_tgt.a 00:03:19.705 LIB libspdk_nvme.a 00:03:19.705 LIB libspdk_virtio.a 00:03:19.964 CC lib/event/reactor.o 00:03:19.964 CC lib/event/log_rpc.o 00:03:19.964 CC lib/event/app.o 00:03:19.964 CC lib/event/app_rpc.o 00:03:19.964 CC lib/event/scheduler_static.o 00:03:20.223 LIB libspdk_accel.a 00:03:20.223 LIB libspdk_event.a 00:03:20.482 CC lib/bdev/bdev.o 00:03:20.482 CC lib/bdev/part.o 00:03:20.482 CC lib/bdev/bdev_rpc.o 00:03:20.482 CC lib/bdev/bdev_zone.o 00:03:20.482 CC lib/bdev/scsi_nvme.o 00:03:21.050 LIB libspdk_blob.a 00:03:21.622 CC lib/lvol/lvol.o 00:03:21.622 CC lib/blobfs/blobfs.o 00:03:21.622 CC lib/blobfs/tree.o 00:03:21.881 LIB libspdk_lvol.a 00:03:21.881 LIB libspdk_blobfs.a 00:03:22.139 LIB libspdk_bdev.a 00:03:22.398 CC lib/nbd/nbd.o 00:03:22.398 CC lib/nbd/nbd_rpc.o 00:03:22.398 CC lib/ublk/ublk.o 00:03:22.398 CC lib/ublk/ublk_rpc.o 00:03:22.656 CC lib/scsi/dev.o 00:03:22.656 CC lib/scsi/lun.o 00:03:22.656 CC lib/scsi/scsi.o 00:03:22.656 CC lib/scsi/scsi_bdev.o 00:03:22.656 CC lib/ftl/ftl_core.o 00:03:22.656 CC lib/ftl/ftl_layout.o 00:03:22.656 CC lib/scsi/port.o 00:03:22.656 CC lib/ftl/ftl_init.o 00:03:22.656 CC lib/ftl/ftl_debug.o 00:03:22.656 CC lib/scsi/scsi_pr.o 00:03:22.656 CC lib/ftl/ftl_io.o 00:03:22.656 CC lib/scsi/scsi_rpc.o 00:03:22.656 CC lib/ftl/ftl_sb.o 00:03:22.656 CC lib/scsi/task.o 00:03:22.656 CC lib/ftl/ftl_l2p.o 00:03:22.656 CC lib/nvmf/ctrlr.o 00:03:22.656 CC lib/ftl/ftl_l2p_flat.o 00:03:22.656 CC lib/ftl/ftl_nv_cache.o 00:03:22.656 CC lib/nvmf/ctrlr_discovery.o 00:03:22.656 CC lib/ftl/ftl_band.o 00:03:22.656 CC lib/nvmf/ctrlr_bdev.o 00:03:22.656 CC lib/ftl/ftl_band_ops.o 00:03:22.656 CC lib/nvmf/subsystem.o 00:03:22.656 CC lib/nvmf/nvmf.o 00:03:22.656 CC lib/ftl/ftl_writer.o 00:03:22.656 CC lib/nvmf/nvmf_rpc.o 00:03:22.656 CC lib/ftl/ftl_rq.o 00:03:22.656 CC lib/nvmf/transport.o 00:03:22.656 CC lib/ftl/ftl_reloc.o 00:03:22.656 CC lib/nvmf/tcp.o 00:03:22.656 CC lib/ftl/ftl_l2p_cache.o 00:03:22.656 CC lib/ftl/mngt/ftl_mngt.o 00:03:22.656 CC lib/nvmf/vfio_user.o 
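[Editor's note] At this point the log has moved from libvfio-user to SPDK proper: the 'CC .../*.o' and 'LIB libspdk_*.a' lines are SPDK's quiet Makefile output as it compiles each component (log, util, nvme, thread, ftl, nvmf, and so on) and archives it into a static libspdk_* library. A rough sketch of driving the same build by hand, under the assumption that this debug, clang-based CI profile corresponds to configure's --enable-debug and --with-vfio-user switches; the autotest scripts choose the real flags, so treat these as illustrative:

    # Sketch only: flags assumed from the debug/clang/libvfio-user profile above
    cd spdk
    CC=clang-16 ./configure --enable-debug --with-vfio-user
    make -j"$(nproc)"
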
00:03:22.656 CC lib/ftl/ftl_p2l.o 00:03:22.656 CC lib/nvmf/rdma.o 00:03:22.656 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:03:22.656 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:03:22.656 CC lib/ftl/mngt/ftl_mngt_startup.o 00:03:22.656 CC lib/ftl/mngt/ftl_mngt_md.o 00:03:22.656 CC lib/ftl/mngt/ftl_mngt_misc.o 00:03:22.656 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:03:22.656 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:03:22.656 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:03:22.656 CC lib/ftl/mngt/ftl_mngt_band.o 00:03:22.656 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:03:22.656 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:03:22.656 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:03:22.656 CC lib/ftl/utils/ftl_conf.o 00:03:22.657 CC lib/ftl/utils/ftl_md.o 00:03:22.657 CC lib/ftl/utils/ftl_mempool.o 00:03:22.657 CC lib/ftl/utils/ftl_bitmap.o 00:03:22.657 CC lib/ftl/utils/ftl_property.o 00:03:22.657 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:03:22.657 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:03:22.657 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:03:22.657 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:03:22.657 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:03:22.657 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:03:22.657 CC lib/ftl/upgrade/ftl_sb_v3.o 00:03:22.657 CC lib/ftl/upgrade/ftl_sb_v5.o 00:03:22.657 CC lib/ftl/nvc/ftl_nvc_dev.o 00:03:22.657 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:03:22.657 CC lib/ftl/base/ftl_base_dev.o 00:03:22.657 CC lib/ftl/ftl_trace.o 00:03:22.657 CC lib/ftl/base/ftl_base_bdev.o 00:03:22.915 LIB libspdk_nbd.a 00:03:22.915 LIB libspdk_ublk.a 00:03:22.915 LIB libspdk_scsi.a 00:03:23.173 LIB libspdk_ftl.a 00:03:23.174 CC lib/iscsi/conn.o 00:03:23.174 CC lib/iscsi/init_grp.o 00:03:23.174 CC lib/iscsi/md5.o 00:03:23.174 CC lib/iscsi/iscsi.o 00:03:23.174 CC lib/iscsi/param.o 00:03:23.174 CC lib/iscsi/portal_grp.o 00:03:23.174 CC lib/iscsi/iscsi_rpc.o 00:03:23.174 CC lib/iscsi/tgt_node.o 00:03:23.174 CC lib/iscsi/iscsi_subsystem.o 00:03:23.174 CC lib/iscsi/task.o 00:03:23.433 CC lib/vhost/vhost_rpc.o 00:03:23.433 CC lib/vhost/vhost_scsi.o 00:03:23.433 CC lib/vhost/vhost.o 00:03:23.433 CC lib/vhost/vhost_blk.o 00:03:23.433 CC lib/vhost/rte_vhost_user.o 00:03:23.765 LIB libspdk_nvmf.a 00:03:24.023 LIB libspdk_vhost.a 00:03:24.023 LIB libspdk_iscsi.a 00:03:24.590 CC module/vfu_device/vfu_virtio.o 00:03:24.590 CC module/env_dpdk/env_dpdk_rpc.o 00:03:24.590 CC module/vfu_device/vfu_virtio_blk.o 00:03:24.590 CC module/vfu_device/vfu_virtio_scsi.o 00:03:24.590 CC module/vfu_device/vfu_virtio_rpc.o 00:03:24.590 CC module/accel/ioat/accel_ioat.o 00:03:24.590 CC module/accel/ioat/accel_ioat_rpc.o 00:03:24.590 LIB libspdk_env_dpdk_rpc.a 00:03:24.590 CC module/accel/error/accel_error_rpc.o 00:03:24.590 CC module/accel/error/accel_error.o 00:03:24.590 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:03:24.590 CC module/accel/dsa/accel_dsa_rpc.o 00:03:24.590 CC module/accel/dsa/accel_dsa.o 00:03:24.590 CC module/sock/posix/posix.o 00:03:24.590 CC module/blob/bdev/blob_bdev.o 00:03:24.590 CC module/accel/iaa/accel_iaa.o 00:03:24.591 CC module/accel/iaa/accel_iaa_rpc.o 00:03:24.591 CC module/scheduler/dynamic/scheduler_dynamic.o 00:03:24.591 CC module/scheduler/gscheduler/gscheduler.o 00:03:24.591 LIB libspdk_scheduler_dpdk_governor.a 00:03:24.591 LIB libspdk_accel_ioat.a 00:03:24.848 LIB libspdk_accel_error.a 00:03:24.848 LIB libspdk_scheduler_gscheduler.a 00:03:24.848 LIB libspdk_scheduler_dynamic.a 00:03:24.848 LIB libspdk_accel_iaa.a 00:03:24.848 LIB libspdk_accel_dsa.a 00:03:24.848 LIB libspdk_blob_bdev.a 00:03:24.848 LIB libspdk_vfu_device.a 00:03:25.107 LIB 
libspdk_sock_posix.a 00:03:25.107 CC module/bdev/virtio/bdev_virtio_scsi.o 00:03:25.107 CC module/bdev/virtio/bdev_virtio_blk.o 00:03:25.107 CC module/bdev/virtio/bdev_virtio_rpc.o 00:03:25.107 CC module/bdev/nvme/bdev_nvme.o 00:03:25.107 CC module/bdev/nvme/nvme_rpc.o 00:03:25.107 CC module/bdev/nvme/bdev_nvme_rpc.o 00:03:25.107 CC module/bdev/nvme/bdev_mdns_client.o 00:03:25.107 CC module/bdev/nvme/vbdev_opal_rpc.o 00:03:25.107 CC module/bdev/nvme/vbdev_opal.o 00:03:25.107 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:03:25.107 CC module/bdev/gpt/gpt.o 00:03:25.107 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:03:25.107 CC module/bdev/gpt/vbdev_gpt.o 00:03:25.107 CC module/bdev/lvol/vbdev_lvol.o 00:03:25.107 CC module/bdev/delay/vbdev_delay_rpc.o 00:03:25.107 CC module/bdev/delay/vbdev_delay.o 00:03:25.107 CC module/bdev/error/vbdev_error.o 00:03:25.107 CC module/bdev/iscsi/bdev_iscsi.o 00:03:25.107 CC module/bdev/zone_block/vbdev_zone_block.o 00:03:25.107 CC module/bdev/error/vbdev_error_rpc.o 00:03:25.107 CC module/bdev/malloc/bdev_malloc.o 00:03:25.107 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:03:25.107 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:03:25.107 CC module/bdev/malloc/bdev_malloc_rpc.o 00:03:25.107 CC module/bdev/passthru/vbdev_passthru.o 00:03:25.107 CC module/bdev/null/bdev_null_rpc.o 00:03:25.107 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:03:25.107 CC module/bdev/null/bdev_null.o 00:03:25.107 CC module/bdev/aio/bdev_aio.o 00:03:25.107 CC module/bdev/aio/bdev_aio_rpc.o 00:03:25.107 CC module/bdev/split/vbdev_split.o 00:03:25.107 CC module/bdev/split/vbdev_split_rpc.o 00:03:25.107 CC module/bdev/ftl/bdev_ftl.o 00:03:25.107 CC module/bdev/ftl/bdev_ftl_rpc.o 00:03:25.107 CC module/bdev/raid/bdev_raid.o 00:03:25.107 CC module/bdev/raid/bdev_raid_sb.o 00:03:25.107 CC module/bdev/raid/bdev_raid_rpc.o 00:03:25.107 CC module/blobfs/bdev/blobfs_bdev.o 00:03:25.107 CC module/bdev/raid/raid1.o 00:03:25.107 CC module/bdev/raid/raid0.o 00:03:25.107 CC module/bdev/raid/concat.o 00:03:25.107 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:03:25.366 LIB libspdk_blobfs_bdev.a 00:03:25.366 LIB libspdk_bdev_gpt.a 00:03:25.366 LIB libspdk_bdev_error.a 00:03:25.366 LIB libspdk_bdev_split.a 00:03:25.366 LIB libspdk_bdev_null.a 00:03:25.366 LIB libspdk_bdev_ftl.a 00:03:25.366 LIB libspdk_bdev_aio.a 00:03:25.366 LIB libspdk_bdev_passthru.a 00:03:25.366 LIB libspdk_bdev_zone_block.a 00:03:25.366 LIB libspdk_bdev_iscsi.a 00:03:25.366 LIB libspdk_bdev_malloc.a 00:03:25.366 LIB libspdk_bdev_delay.a 00:03:25.625 LIB libspdk_bdev_lvol.a 00:03:25.625 LIB libspdk_bdev_virtio.a 00:03:25.625 LIB libspdk_bdev_raid.a 00:03:26.563 LIB libspdk_bdev_nvme.a 00:03:27.130 CC module/event/subsystems/vmd/vmd.o 00:03:27.130 CC module/event/subsystems/vmd/vmd_rpc.o 00:03:27.130 CC module/event/subsystems/scheduler/scheduler.o 00:03:27.130 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:03:27.130 CC module/event/subsystems/iobuf/iobuf.o 00:03:27.130 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:03:27.130 CC module/event/subsystems/sock/sock.o 00:03:27.130 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:03:27.130 LIB libspdk_event_vmd.a 00:03:27.130 LIB libspdk_event_scheduler.a 00:03:27.130 LIB libspdk_event_vfu_tgt.a 00:03:27.130 LIB libspdk_event_sock.a 00:03:27.130 LIB libspdk_event_vhost_blk.a 00:03:27.130 LIB libspdk_event_iobuf.a 00:03:27.389 CC module/event/subsystems/accel/accel.o 00:03:27.648 LIB libspdk_event_accel.a 00:03:27.907 CC module/event/subsystems/bdev/bdev.o 00:03:27.907 LIB 
libspdk_event_bdev.a 00:03:28.165 CC module/event/subsystems/ublk/ublk.o 00:03:28.165 CC module/event/subsystems/scsi/scsi.o 00:03:28.425 CC module/event/subsystems/nbd/nbd.o 00:03:28.425 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:28.425 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:28.425 LIB libspdk_event_ublk.a 00:03:28.425 LIB libspdk_event_scsi.a 00:03:28.425 LIB libspdk_event_nbd.a 00:03:28.425 LIB libspdk_event_nvmf.a 00:03:28.684 CC module/event/subsystems/iscsi/iscsi.o 00:03:28.684 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:28.684 LIB libspdk_event_iscsi.a 00:03:28.943 LIB libspdk_event_vhost_scsi.a 00:03:29.213 CC test/rpc_client/rpc_client_test.o 00:03:29.213 CXX app/trace/trace.o 00:03:29.213 CC app/spdk_lspci/spdk_lspci.o 00:03:29.213 CC app/spdk_nvme_discover/discovery_aer.o 00:03:29.213 CC app/spdk_nvme_identify/identify.o 00:03:29.213 TEST_HEADER include/spdk/accel.h 00:03:29.213 TEST_HEADER include/spdk/accel_module.h 00:03:29.213 TEST_HEADER include/spdk/assert.h 00:03:29.213 TEST_HEADER include/spdk/barrier.h 00:03:29.213 TEST_HEADER include/spdk/base64.h 00:03:29.213 TEST_HEADER include/spdk/bdev.h 00:03:29.213 TEST_HEADER include/spdk/bdev_zone.h 00:03:29.213 CC app/spdk_top/spdk_top.o 00:03:29.213 TEST_HEADER include/spdk/bdev_module.h 00:03:29.213 TEST_HEADER include/spdk/bit_array.h 00:03:29.213 TEST_HEADER include/spdk/bit_pool.h 00:03:29.213 TEST_HEADER include/spdk/blob_bdev.h 00:03:29.213 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:29.213 CC app/spdk_nvme_perf/perf.o 00:03:29.213 CC app/trace_record/trace_record.o 00:03:29.213 TEST_HEADER include/spdk/blobfs.h 00:03:29.213 TEST_HEADER include/spdk/blob.h 00:03:29.213 TEST_HEADER include/spdk/conf.h 00:03:29.213 TEST_HEADER include/spdk/config.h 00:03:29.213 TEST_HEADER include/spdk/cpuset.h 00:03:29.213 TEST_HEADER include/spdk/crc16.h 00:03:29.213 TEST_HEADER include/spdk/crc32.h 00:03:29.213 TEST_HEADER include/spdk/crc64.h 00:03:29.213 TEST_HEADER include/spdk/dif.h 00:03:29.213 TEST_HEADER include/spdk/dma.h 00:03:29.213 TEST_HEADER include/spdk/endian.h 00:03:29.213 TEST_HEADER include/spdk/env_dpdk.h 00:03:29.213 TEST_HEADER include/spdk/event.h 00:03:29.213 TEST_HEADER include/spdk/env.h 00:03:29.213 CC app/iscsi_tgt/iscsi_tgt.o 00:03:29.213 TEST_HEADER include/spdk/fd_group.h 00:03:29.213 TEST_HEADER include/spdk/fd.h 00:03:29.213 TEST_HEADER include/spdk/ftl.h 00:03:29.213 TEST_HEADER include/spdk/file.h 00:03:29.213 TEST_HEADER include/spdk/gpt_spec.h 00:03:29.213 TEST_HEADER include/spdk/hexlify.h 00:03:29.213 TEST_HEADER include/spdk/histogram_data.h 00:03:29.213 TEST_HEADER include/spdk/idxd.h 00:03:29.213 TEST_HEADER include/spdk/idxd_spec.h 00:03:29.213 TEST_HEADER include/spdk/ioat.h 00:03:29.213 TEST_HEADER include/spdk/init.h 00:03:29.213 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:29.213 TEST_HEADER include/spdk/iscsi_spec.h 00:03:29.213 TEST_HEADER include/spdk/ioat_spec.h 00:03:29.213 TEST_HEADER include/spdk/json.h 00:03:29.213 TEST_HEADER include/spdk/jsonrpc.h 00:03:29.213 TEST_HEADER include/spdk/likely.h 00:03:29.213 TEST_HEADER include/spdk/log.h 00:03:29.213 TEST_HEADER include/spdk/memory.h 00:03:29.213 TEST_HEADER include/spdk/lvol.h 00:03:29.213 TEST_HEADER include/spdk/mmio.h 00:03:29.213 TEST_HEADER include/spdk/notify.h 00:03:29.213 TEST_HEADER include/spdk/nbd.h 00:03:29.213 TEST_HEADER include/spdk/nvme.h 00:03:29.213 TEST_HEADER include/spdk/nvme_intel.h 00:03:29.213 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:29.213 TEST_HEADER 
include/spdk/nvme_ocssd_spec.h 00:03:29.213 TEST_HEADER include/spdk/nvme_spec.h 00:03:29.213 CC app/spdk_dd/spdk_dd.o 00:03:29.213 TEST_HEADER include/spdk/nvme_zns.h 00:03:29.213 TEST_HEADER include/spdk/nvmf_cmd.h 00:03:29.213 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:29.213 TEST_HEADER include/spdk/nvmf.h 00:03:29.213 TEST_HEADER include/spdk/nvmf_spec.h 00:03:29.213 TEST_HEADER include/spdk/nvmf_transport.h 00:03:29.213 TEST_HEADER include/spdk/opal.h 00:03:29.213 TEST_HEADER include/spdk/opal_spec.h 00:03:29.213 TEST_HEADER include/spdk/pci_ids.h 00:03:29.213 TEST_HEADER include/spdk/pipe.h 00:03:29.213 TEST_HEADER include/spdk/queue.h 00:03:29.213 TEST_HEADER include/spdk/reduce.h 00:03:29.213 TEST_HEADER include/spdk/rpc.h 00:03:29.213 TEST_HEADER include/spdk/scheduler.h 00:03:29.213 TEST_HEADER include/spdk/scsi.h 00:03:29.213 TEST_HEADER include/spdk/scsi_spec.h 00:03:29.213 TEST_HEADER include/spdk/sock.h 00:03:29.213 TEST_HEADER include/spdk/stdinc.h 00:03:29.213 TEST_HEADER include/spdk/thread.h 00:03:29.213 TEST_HEADER include/spdk/string.h 00:03:29.213 TEST_HEADER include/spdk/trace.h 00:03:29.213 CC app/spdk_tgt/spdk_tgt.o 00:03:29.213 CC app/vhost/vhost.o 00:03:29.213 TEST_HEADER include/spdk/trace_parser.h 00:03:29.213 TEST_HEADER include/spdk/ublk.h 00:03:29.213 TEST_HEADER include/spdk/tree.h 00:03:29.213 TEST_HEADER include/spdk/util.h 00:03:29.213 CC app/nvmf_tgt/nvmf_main.o 00:03:29.213 TEST_HEADER include/spdk/uuid.h 00:03:29.213 TEST_HEADER include/spdk/version.h 00:03:29.213 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:29.213 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:29.213 TEST_HEADER include/spdk/vmd.h 00:03:29.213 TEST_HEADER include/spdk/vhost.h 00:03:29.213 TEST_HEADER include/spdk/zipf.h 00:03:29.213 TEST_HEADER include/spdk/xor.h 00:03:29.213 CXX test/cpp_headers/accel_module.o 00:03:29.213 CXX test/cpp_headers/accel.o 00:03:29.213 CXX test/cpp_headers/assert.o 00:03:29.213 CXX test/cpp_headers/barrier.o 00:03:29.213 CXX test/cpp_headers/base64.o 00:03:29.213 CC test/app/histogram_perf/histogram_perf.o 00:03:29.213 CXX test/cpp_headers/bdev.o 00:03:29.213 CXX test/cpp_headers/bdev_module.o 00:03:29.213 CXX test/cpp_headers/bdev_zone.o 00:03:29.213 CXX test/cpp_headers/bit_array.o 00:03:29.213 CXX test/cpp_headers/bit_pool.o 00:03:29.213 CXX test/cpp_headers/blob_bdev.o 00:03:29.213 CXX test/cpp_headers/blobfs_bdev.o 00:03:29.213 CXX test/cpp_headers/blobfs.o 00:03:29.213 CXX test/cpp_headers/blob.o 00:03:29.213 CXX test/cpp_headers/config.o 00:03:29.213 CXX test/cpp_headers/conf.o 00:03:29.213 CXX test/cpp_headers/cpuset.o 00:03:29.213 CC test/app/jsoncat/jsoncat.o 00:03:29.213 CXX test/cpp_headers/crc16.o 00:03:29.213 CXX test/cpp_headers/crc32.o 00:03:29.213 CXX test/cpp_headers/crc64.o 00:03:29.213 CXX test/cpp_headers/dif.o 00:03:29.213 CC test/event/reactor_perf/reactor_perf.o 00:03:29.213 CXX test/cpp_headers/dma.o 00:03:29.213 CXX test/cpp_headers/endian.o 00:03:29.213 CXX test/cpp_headers/env_dpdk.o 00:03:29.213 CC examples/idxd/perf/perf.o 00:03:29.213 CXX test/cpp_headers/env.o 00:03:29.213 CXX test/cpp_headers/event.o 00:03:29.213 CC test/app/stub/stub.o 00:03:29.213 CC test/event/reactor/reactor.o 00:03:29.213 CXX test/cpp_headers/fd_group.o 00:03:29.213 CXX test/cpp_headers/fd.o 00:03:29.213 CXX test/cpp_headers/file.o 00:03:29.213 CXX test/cpp_headers/ftl.o 00:03:29.213 CXX test/cpp_headers/gpt_spec.o 00:03:29.213 CC examples/accel/perf/accel_perf.o 00:03:29.213 CXX test/cpp_headers/hexlify.o 00:03:29.213 CC 
examples/ioat/verify/verify.o 00:03:29.213 CC examples/util/zipf/zipf.o 00:03:29.213 CC test/event/event_perf/event_perf.o 00:03:29.213 CC examples/vmd/lsvmd/lsvmd.o 00:03:29.213 CC test/env/memory/memory_ut.o 00:03:29.213 CC examples/ioat/perf/perf.o 00:03:29.213 CC test/nvme/aer/aer.o 00:03:29.213 CC test/nvme/reset/reset.o 00:03:29.213 CC test/event/app_repeat/app_repeat.o 00:03:29.213 CC test/nvme/overhead/overhead.o 00:03:29.213 CC test/nvme/simple_copy/simple_copy.o 00:03:29.213 CC test/nvme/sgl/sgl.o 00:03:29.213 CC test/env/vtophys/vtophys.o 00:03:29.213 CC test/nvme/boot_partition/boot_partition.o 00:03:29.213 CC test/nvme/err_injection/err_injection.o 00:03:29.213 CC test/env/pci/pci_ut.o 00:03:29.213 CC examples/nvme/hello_world/hello_world.o 00:03:29.213 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:29.213 CC test/nvme/startup/startup.o 00:03:29.213 CC test/nvme/compliance/nvme_compliance.o 00:03:29.213 CC examples/vmd/led/led.o 00:03:29.213 CC test/nvme/reserve/reserve.o 00:03:29.213 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:29.213 CC test/nvme/e2edp/nvme_dp.o 00:03:29.213 CC examples/sock/hello_world/hello_sock.o 00:03:29.213 CC examples/nvme/abort/abort.o 00:03:29.213 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:29.213 CC test/nvme/connect_stress/connect_stress.o 00:03:29.213 CC test/bdev/bdevio/bdevio.o 00:03:29.213 CC test/nvme/cuse/cuse.o 00:03:29.213 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:03:29.213 CC app/fio/nvme/fio_plugin.o 00:03:29.213 CC test/nvme/fdp/fdp.o 00:03:29.213 CC test/blobfs/mkfs/mkfs.o 00:03:29.213 CC test/thread/poller_perf/poller_perf.o 00:03:29.213 CC examples/nvme/hotplug/hotplug.o 00:03:29.213 CC test/thread/lock/spdk_lock.o 00:03:29.213 CC test/nvme/fused_ordering/fused_ordering.o 00:03:29.213 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:29.477 CC examples/nvme/reconnect/reconnect.o 00:03:29.477 CC examples/nvme/arbitration/arbitration.o 00:03:29.477 CC test/app/bdev_svc/bdev_svc.o 00:03:29.477 CC examples/blob/cli/blobcli.o 00:03:29.477 CC test/dma/test_dma/test_dma.o 00:03:29.477 CC examples/blob/hello_world/hello_blob.o 00:03:29.477 CC examples/nvmf/nvmf/nvmf.o 00:03:29.477 CXX test/cpp_headers/histogram_data.o 00:03:29.477 CC test/event/scheduler/scheduler.o 00:03:29.477 CC examples/thread/thread/thread_ex.o 00:03:29.477 CC examples/bdev/hello_world/hello_bdev.o 00:03:29.477 CC test/accel/dif/dif.o 00:03:29.477 LINK spdk_lspci 00:03:29.477 CC examples/bdev/bdevperf/bdevperf.o 00:03:29.477 CC app/fio/bdev/fio_plugin.o 00:03:29.477 LINK rpc_client_test 00:03:29.477 LINK spdk_nvme_discover 00:03:29.477 CC test/lvol/esnap/esnap.o 00:03:29.477 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:29.477 CC test/env/mem_callbacks/mem_callbacks.o 00:03:29.477 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:29.478 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:29.478 LINK interrupt_tgt 00:03:29.478 LINK histogram_perf 00:03:29.478 LINK jsoncat 00:03:29.478 LINK reactor_perf 00:03:29.478 LINK lsvmd 00:03:29.478 LINK reactor 00:03:29.478 CXX test/cpp_headers/idxd.o 00:03:29.478 LINK iscsi_tgt 00:03:29.478 LINK spdk_trace_record 00:03:29.478 LINK led 00:03:29.478 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:29.478 LINK zipf 00:03:29.478 LINK nvmf_tgt 00:03:29.478 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:03:29.478 CXX test/cpp_headers/idxd_spec.o 00:03:29.478 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:03:29.478 CXX test/cpp_headers/init.o 00:03:29.478 LINK event_perf 00:03:29.478 CXX 
test/cpp_headers/ioat.o 00:03:29.478 CXX test/cpp_headers/ioat_spec.o 00:03:29.478 CXX test/cpp_headers/iscsi_spec.o 00:03:29.478 LINK vtophys 00:03:29.478 CXX test/cpp_headers/json.o 00:03:29.478 CXX test/cpp_headers/jsonrpc.o 00:03:29.478 LINK vhost 00:03:29.478 CXX test/cpp_headers/likely.o 00:03:29.478 CXX test/cpp_headers/log.o 00:03:29.478 LINK app_repeat 00:03:29.478 CXX test/cpp_headers/lvol.o 00:03:29.478 CXX test/cpp_headers/memory.o 00:03:29.478 LINK env_dpdk_post_init 00:03:29.478 CXX test/cpp_headers/mmio.o 00:03:29.478 CXX test/cpp_headers/nbd.o 00:03:29.478 LINK poller_perf 00:03:29.478 CXX test/cpp_headers/notify.o 00:03:29.478 CXX test/cpp_headers/nvme.o 00:03:29.478 CXX test/cpp_headers/nvme_intel.o 00:03:29.478 LINK stub 00:03:29.478 CXX test/cpp_headers/nvme_ocssd.o 00:03:29.478 CXX test/cpp_headers/nvme_spec.o 00:03:29.478 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:29.478 CXX test/cpp_headers/nvme_zns.o 00:03:29.478 CXX test/cpp_headers/nvmf_cmd.o 00:03:29.478 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:29.478 CXX test/cpp_headers/nvmf.o 00:03:29.478 CXX test/cpp_headers/nvmf_spec.o 00:03:29.478 LINK doorbell_aers 00:03:29.478 CXX test/cpp_headers/nvmf_transport.o 00:03:29.478 CXX test/cpp_headers/opal.o 00:03:29.478 LINK err_injection 00:03:29.478 LINK boot_partition 00:03:29.478 CXX test/cpp_headers/opal_spec.o 00:03:29.478 LINK connect_stress 00:03:29.478 LINK spdk_tgt 00:03:29.478 LINK startup 00:03:29.478 CXX test/cpp_headers/pci_ids.o 00:03:29.478 LINK pmr_persistence 00:03:29.478 CXX test/cpp_headers/pipe.o 00:03:29.478 CXX test/cpp_headers/queue.o 00:03:29.478 LINK bdev_svc 00:03:29.747 fio_plugin.c:1491:29: warning: field 'ruhs' with variable sized type 'struct spdk_nvme_fdp_ruhs' not at the end of a struct or class is a GNU extension [-Wgnu-variable-sized-type-not-at-end] 00:03:29.747 struct spdk_nvme_fdp_ruhs ruhs; 00:03:29.747 ^ 00:03:29.747 CXX test/cpp_headers/reduce.o 00:03:29.747 LINK hello_world 00:03:29.747 LINK reserve 00:03:29.747 LINK mkfs 00:03:29.747 LINK simple_copy 00:03:29.747 CXX test/cpp_headers/rpc.o 00:03:29.747 LINK verify 00:03:29.747 LINK fused_ordering 00:03:29.747 LINK ioat_perf 00:03:29.747 LINK cmb_copy 00:03:29.747 LINK hello_sock 00:03:29.747 CXX test/cpp_headers/scheduler.o 00:03:29.747 LINK hotplug 00:03:29.747 LINK aer 00:03:29.747 LINK hello_blob 00:03:29.747 LINK hello_bdev 00:03:29.747 LINK sgl 00:03:29.747 LINK scheduler 00:03:29.747 LINK thread 00:03:29.747 LINK reset 00:03:29.747 LINK nvme_dp 00:03:29.747 LINK spdk_trace 00:03:29.747 LINK overhead 00:03:29.747 LINK fdp 00:03:29.747 LINK nvmf 00:03:29.747 CXX test/cpp_headers/scsi.o 00:03:29.747 CXX test/cpp_headers/scsi_spec.o 00:03:29.747 CXX test/cpp_headers/sock.o 00:03:29.747 CXX test/cpp_headers/stdinc.o 00:03:29.747 CXX test/cpp_headers/string.o 00:03:29.747 CXX test/cpp_headers/thread.o 00:03:29.747 CXX test/cpp_headers/trace.o 00:03:29.747 CXX test/cpp_headers/trace_parser.o 00:03:29.747 CXX test/cpp_headers/tree.o 00:03:29.747 CXX test/cpp_headers/ublk.o 00:03:29.747 CXX test/cpp_headers/util.o 00:03:29.747 CXX test/cpp_headers/uuid.o 00:03:29.747 CXX test/cpp_headers/version.o 00:03:29.747 LINK reconnect 00:03:29.747 CXX test/cpp_headers/vfio_user_pci.o 00:03:29.747 LINK idxd_perf 00:03:29.747 CXX test/cpp_headers/vfio_user_spec.o 00:03:29.747 LINK abort 00:03:29.747 LINK bdevio 00:03:29.747 CXX test/cpp_headers/vhost.o 00:03:29.747 LINK test_dma 00:03:29.747 CXX test/cpp_headers/vmd.o 00:03:29.747 CXX test/cpp_headers/xor.o 00:03:29.747 CXX 
test/cpp_headers/zipf.o 00:03:29.747 LINK spdk_dd 00:03:30.008 LINK dif 00:03:30.008 LINK nvme_compliance 00:03:30.008 LINK arbitration 00:03:30.008 LINK accel_perf 00:03:30.008 LINK pci_ut 00:03:30.008 LINK nvme_manage 00:03:30.008 LINK llvm_vfio_fuzz 00:03:30.008 LINK blobcli 00:03:30.008 LINK nvme_fuzz 00:03:30.008 LINK spdk_bdev 00:03:30.008 LINK mem_callbacks 00:03:30.008 LINK vhost_fuzz 00:03:30.267 LINK spdk_nvme_perf 00:03:30.267 1 warning generated. 00:03:30.267 LINK spdk_nvme 00:03:30.267 LINK spdk_nvme_identify 00:03:30.267 LINK bdevperf 00:03:30.267 LINK llvm_nvme_fuzz 00:03:30.267 LINK cuse 00:03:30.267 LINK spdk_top 00:03:30.267 LINK memory_ut 00:03:30.834 LINK iscsi_fuzz 00:03:31.093 LINK spdk_lock 00:03:32.997 LINK esnap 00:03:33.255 00:03:33.255 real 0m23.823s 00:03:33.255 user 4m38.987s 00:03:33.255 sys 1m53.762s 00:03:33.255 10:34:49 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:03:33.255 10:34:49 -- common/autotest_common.sh@10 -- $ set +x 00:03:33.255 ************************************ 00:03:33.255 END TEST make 00:03:33.255 ************************************ 00:03:33.514 10:34:49 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:03:33.514 10:34:49 -- nvmf/common.sh@7 -- # uname -s 00:03:33.514 10:34:49 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:33.514 10:34:49 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:33.514 10:34:49 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:33.514 10:34:49 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:33.514 10:34:49 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:33.514 10:34:49 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:33.514 10:34:49 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:33.514 10:34:49 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:33.514 10:34:49 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:33.514 10:34:49 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:33.514 10:34:49 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:03:33.514 10:34:49 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:03:33.514 10:34:49 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:33.514 10:34:49 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:33.514 10:34:49 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:33.514 10:34:49 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:03:33.514 10:34:49 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:33.515 10:34:49 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:33.515 10:34:49 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:33.515 10:34:49 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:33.515 10:34:49 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:33.515 10:34:49 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:33.515 10:34:49 -- paths/export.sh@5 -- # export PATH 00:03:33.515 10:34:49 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:33.515 10:34:49 -- nvmf/common.sh@46 -- # : 0 00:03:33.515 10:34:49 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:03:33.515 10:34:49 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:03:33.515 10:34:49 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:03:33.515 10:34:49 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:33.515 10:34:49 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:33.515 10:34:49 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:03:33.515 10:34:49 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:03:33.515 10:34:49 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:03:33.515 10:34:49 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:33.515 10:34:49 -- spdk/autotest.sh@32 -- # uname -s 00:03:33.515 10:34:49 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:33.515 10:34:49 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:33.515 10:34:49 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:33.515 10:34:49 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:33.515 10:34:49 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:33.515 10:34:49 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:33.515 10:34:49 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:33.515 10:34:49 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:33.515 10:34:49 -- spdk/autotest.sh@48 -- # udevadm_pid=1919614 00:03:33.515 10:34:49 -- spdk/autotest.sh@51 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:33.515 10:34:49 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:33.515 10:34:49 -- spdk/autotest.sh@54 -- # echo 1919616 00:03:33.515 10:34:49 -- spdk/autotest.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:33.515 10:34:49 -- spdk/autotest.sh@55 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:33.515 10:34:49 -- spdk/autotest.sh@56 -- # echo 1919617 00:03:33.515 10:34:49 -- spdk/autotest.sh@58 -- # [[ ............................... 
!= QEMU ]] 00:03:33.515 10:34:49 -- spdk/autotest.sh@60 -- # echo 1919618 00:03:33.515 10:34:49 -- spdk/autotest.sh@59 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:33.515 10:34:49 -- spdk/autotest.sh@62 -- # echo 1919619 00:03:33.515 10:34:49 -- spdk/autotest.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:33.515 10:34:49 -- spdk/autotest.sh@66 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:33.515 10:34:49 -- spdk/autotest.sh@68 -- # timing_enter autotest 00:03:33.515 10:34:49 -- common/autotest_common.sh@712 -- # xtrace_disable 00:03:33.515 10:34:49 -- common/autotest_common.sh@10 -- # set +x 00:03:33.515 10:34:49 -- spdk/autotest.sh@70 -- # create_test_list 00:03:33.515 10:34:49 -- common/autotest_common.sh@736 -- # xtrace_disable 00:03:33.515 10:34:49 -- common/autotest_common.sh@10 -- # set +x 00:03:33.515 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.bmc.pm.log 00:03:33.515 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pm.log 00:03:33.515 10:34:49 -- spdk/autotest.sh@72 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:03:33.515 10:34:49 -- spdk/autotest.sh@72 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:33.515 10:34:49 -- spdk/autotest.sh@72 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:33.515 10:34:49 -- spdk/autotest.sh@73 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:03:33.515 10:34:49 -- spdk/autotest.sh@74 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:33.515 10:34:49 -- spdk/autotest.sh@76 -- # freebsd_update_contigmem_mod 00:03:33.515 10:34:49 -- common/autotest_common.sh@1440 -- # uname 00:03:33.515 10:34:49 -- common/autotest_common.sh@1440 -- # '[' Linux = FreeBSD ']' 00:03:33.515 10:34:49 -- spdk/autotest.sh@77 -- # freebsd_set_maxsock_buf 00:03:33.515 10:34:49 -- common/autotest_common.sh@1460 -- # uname 00:03:33.773 10:34:49 -- common/autotest_common.sh@1460 -- # [[ Linux = FreeBSD ]] 00:03:33.773 10:34:49 -- spdk/autotest.sh@82 -- # grep CC_TYPE mk/cc.mk 00:03:33.773 10:34:49 -- spdk/autotest.sh@82 -- # CC_TYPE=CC_TYPE=clang 00:03:33.773 10:34:49 -- spdk/autotest.sh@83 -- # hash lcov 00:03:33.773 10:34:49 -- spdk/autotest.sh@83 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]] 00:03:33.773 10:34:49 -- spdk/autotest.sh@100 -- # timing_enter pre_cleanup 00:03:33.773 10:34:49 -- common/autotest_common.sh@712 -- # xtrace_disable 00:03:33.773 10:34:49 -- common/autotest_common.sh@10 -- # set +x 00:03:33.773 10:34:49 -- spdk/autotest.sh@102 -- # rm -f 00:03:33.773 10:34:49 -- spdk/autotest.sh@105 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:37.061 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:37.061 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:37.061 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:37.061 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:37.061 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:37.319 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:37.319 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:37.319 
0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:37.319 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:37.319 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:37.320 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:37.320 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:37.320 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:37.320 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:37.578 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:37.578 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:37.578 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:03:37.578 10:34:53 -- spdk/autotest.sh@107 -- # get_zoned_devs 00:03:37.578 10:34:53 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:03:37.578 10:34:53 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:03:37.578 10:34:53 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:03:37.578 10:34:53 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:03:37.578 10:34:53 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:03:37.578 10:34:53 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:03:37.578 10:34:53 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:37.578 10:34:53 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:03:37.578 10:34:53 -- spdk/autotest.sh@109 -- # (( 0 > 0 )) 00:03:37.578 10:34:53 -- spdk/autotest.sh@121 -- # ls /dev/nvme0n1 00:03:37.578 10:34:53 -- spdk/autotest.sh@121 -- # grep -v p 00:03:37.578 10:34:53 -- spdk/autotest.sh@121 -- # for dev in $(ls /dev/nvme*n* | grep -v p || true) 00:03:37.578 10:34:53 -- spdk/autotest.sh@123 -- # [[ -z '' ]] 00:03:37.578 10:34:53 -- spdk/autotest.sh@124 -- # block_in_use /dev/nvme0n1 00:03:37.578 10:34:53 -- scripts/common.sh@380 -- # local block=/dev/nvme0n1 pt 00:03:37.578 10:34:53 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:37.578 No valid GPT data, bailing 00:03:37.578 10:34:53 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:37.578 10:34:53 -- scripts/common.sh@393 -- # pt= 00:03:37.578 10:34:53 -- scripts/common.sh@394 -- # return 1 00:03:37.578 10:34:53 -- spdk/autotest.sh@125 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:37.578 1+0 records in 00:03:37.578 1+0 records out 00:03:37.578 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00188093 s, 557 MB/s 00:03:37.578 10:34:53 -- spdk/autotest.sh@129 -- # sync 00:03:37.578 10:34:53 -- spdk/autotest.sh@131 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:37.578 10:34:53 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:37.578 10:34:53 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:45.697 10:35:00 -- spdk/autotest.sh@135 -- # uname -s 00:03:45.697 10:35:00 -- spdk/autotest.sh@135 -- # '[' Linux = Linux ']' 00:03:45.697 10:35:00 -- spdk/autotest.sh@136 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:45.697 10:35:00 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:45.697 10:35:00 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:45.697 10:35:00 -- common/autotest_common.sh@10 -- # set +x 00:03:45.697 ************************************ 00:03:45.697 START TEST setup.sh 00:03:45.697 ************************************ 00:03:45.697 10:35:00 -- 
common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:45.697 * Looking for test storage... 00:03:45.697 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:45.697 10:35:00 -- setup/test-setup.sh@10 -- # uname -s 00:03:45.697 10:35:00 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:45.697 10:35:00 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:45.697 10:35:00 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:45.697 10:35:00 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:45.697 10:35:00 -- common/autotest_common.sh@10 -- # set +x 00:03:45.697 ************************************ 00:03:45.697 START TEST acl 00:03:45.697 ************************************ 00:03:45.697 10:35:00 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:45.697 * Looking for test storage... 00:03:45.697 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:45.697 10:35:01 -- setup/acl.sh@10 -- # get_zoned_devs 00:03:45.697 10:35:01 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:03:45.697 10:35:01 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:03:45.697 10:35:01 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:03:45.697 10:35:01 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:03:45.697 10:35:01 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:03:45.697 10:35:01 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:03:45.697 10:35:01 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:45.697 10:35:01 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:03:45.697 10:35:01 -- setup/acl.sh@12 -- # devs=() 00:03:45.697 10:35:01 -- setup/acl.sh@12 -- # declare -a devs 00:03:45.697 10:35:01 -- setup/acl.sh@13 -- # drivers=() 00:03:45.697 10:35:01 -- setup/acl.sh@13 -- # declare -A drivers 00:03:45.697 10:35:01 -- setup/acl.sh@51 -- # setup reset 00:03:45.697 10:35:01 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:45.697 10:35:01 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:48.985 10:35:04 -- setup/acl.sh@52 -- # collect_setup_devs 00:03:48.985 10:35:04 -- setup/acl.sh@16 -- # local dev driver 00:03:48.985 10:35:04 -- setup/acl.sh@15 -- # setup output status 00:03:48.985 10:35:04 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:48.985 10:35:04 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:48.985 10:35:04 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:03:52.275 Hugepages 00:03:52.275 node hugesize free / total 00:03:52.275 10:35:07 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:52.275 10:35:07 -- setup/acl.sh@19 -- # continue 00:03:52.275 10:35:07 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:52.275 10:35:07 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:52.275 10:35:07 -- setup/acl.sh@19 -- # continue 00:03:52.275 10:35:07 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:52.275 10:35:07 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:52.275 10:35:07 -- setup/acl.sh@19 -- # continue 00:03:52.275 10:35:07 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:52.275 00:03:52.275 Type BDF Vendor Device 
NUMA Driver Device Block devices 00:03:52.275 10:35:07 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:52.275 10:35:07 -- setup/acl.sh@19 -- # continue 00:03:52.275 10:35:07 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:52.275 10:35:07 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:03:52.275 10:35:07 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:52.275 10:35:07 -- setup/acl.sh@20 -- # continue 00:03:52.275 10:35:07 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:52.275 10:35:07 -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:03:52.275 10:35:07 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:52.275 10:35:07 -- setup/acl.sh@20 -- # continue 00:03:52.275 10:35:07 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:52.275 10:35:07 -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:03:52.275 10:35:07 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:52.275 10:35:07 -- setup/acl.sh@20 -- # continue 00:03:52.275 10:35:07 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:52.275 10:35:07 -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:03:52.275 10:35:07 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:52.275 10:35:07 -- setup/acl.sh@20 -- # continue 00:03:52.275 10:35:07 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:52.275 10:35:07 -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:03:52.275 10:35:07 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:52.275 10:35:07 -- setup/acl.sh@20 -- # continue 00:03:52.275 10:35:07 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:52.275 10:35:07 -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:03:52.275 10:35:07 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:52.275 10:35:07 -- setup/acl.sh@20 -- # continue 00:03:52.275 10:35:07 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:52.275 10:35:07 -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:03:52.275 10:35:07 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:52.275 10:35:07 -- setup/acl.sh@20 -- # continue 00:03:52.275 10:35:07 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:52.275 10:35:07 -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:03:52.275 10:35:07 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:52.275 10:35:07 -- setup/acl.sh@20 -- # continue 00:03:52.275 10:35:07 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:52.275 10:35:08 -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:03:52.275 10:35:08 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:52.275 10:35:08 -- setup/acl.sh@20 -- # continue 00:03:52.275 10:35:08 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:52.275 10:35:08 -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:03:52.275 10:35:08 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:52.275 10:35:08 -- setup/acl.sh@20 -- # continue 00:03:52.275 10:35:08 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:52.275 10:35:08 -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:03:52.275 10:35:08 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:52.275 10:35:08 -- setup/acl.sh@20 -- # continue 00:03:52.275 10:35:08 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:52.275 10:35:08 -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:03:52.275 10:35:08 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:52.275 10:35:08 -- setup/acl.sh@20 -- # continue 00:03:52.275 10:35:08 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:52.275 10:35:08 -- 
setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:03:52.275 10:35:08 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:52.275 10:35:08 -- setup/acl.sh@20 -- # continue 00:03:52.275 10:35:08 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:52.275 10:35:08 -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:03:52.275 10:35:08 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:52.275 10:35:08 -- setup/acl.sh@20 -- # continue 00:03:52.275 10:35:08 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:52.275 10:35:08 -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:03:52.275 10:35:08 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:52.275 10:35:08 -- setup/acl.sh@20 -- # continue 00:03:52.275 10:35:08 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:52.275 10:35:08 -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:03:52.275 10:35:08 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:52.275 10:35:08 -- setup/acl.sh@20 -- # continue 00:03:52.275 10:35:08 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:52.275 10:35:08 -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:03:52.275 10:35:08 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:52.275 10:35:08 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:03:52.275 10:35:08 -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:52.275 10:35:08 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:52.275 10:35:08 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:52.275 10:35:08 -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:52.275 10:35:08 -- setup/acl.sh@54 -- # run_test denied denied 00:03:52.275 10:35:08 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:03:52.275 10:35:08 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:03:52.275 10:35:08 -- common/autotest_common.sh@10 -- # set +x 00:03:52.275 ************************************ 00:03:52.275 START TEST denied 00:03:52.275 ************************************ 00:03:52.275 10:35:08 -- common/autotest_common.sh@1104 -- # denied 00:03:52.275 10:35:08 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:03:52.275 10:35:08 -- setup/acl.sh@38 -- # setup output config 00:03:52.275 10:35:08 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:03:52.275 10:35:08 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:52.275 10:35:08 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:55.563 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:03:55.563 10:35:11 -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:03:55.563 10:35:11 -- setup/acl.sh@28 -- # local dev driver 00:03:55.563 10:35:11 -- setup/acl.sh@30 -- # for dev in "$@" 00:03:55.563 10:35:11 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:03:55.563 10:35:11 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:03:55.563 10:35:11 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:55.563 10:35:11 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:55.563 10:35:11 -- setup/acl.sh@41 -- # setup reset 00:03:55.563 10:35:11 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:55.563 10:35:11 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:00.836 00:04:00.836 real 0m8.247s 00:04:00.836 user 0m2.620s 00:04:00.836 sys 0m5.002s 00:04:00.836 10:35:16 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:00.836 10:35:16 -- 
common/autotest_common.sh@10 -- # set +x 00:04:00.836 ************************************ 00:04:00.836 END TEST denied 00:04:00.836 ************************************ 00:04:00.836 10:35:16 -- setup/acl.sh@55 -- # run_test allowed allowed 00:04:00.836 10:35:16 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:00.836 10:35:16 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:00.836 10:35:16 -- common/autotest_common.sh@10 -- # set +x 00:04:00.836 ************************************ 00:04:00.836 START TEST allowed 00:04:00.836 ************************************ 00:04:00.836 10:35:16 -- common/autotest_common.sh@1104 -- # allowed 00:04:00.836 10:35:16 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:04:00.836 10:35:16 -- setup/acl.sh@45 -- # setup output config 00:04:00.836 10:35:16 -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:04:00.836 10:35:16 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:00.836 10:35:16 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:05.032 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:05.032 10:35:21 -- setup/acl.sh@47 -- # verify 00:04:05.032 10:35:21 -- setup/acl.sh@28 -- # local dev driver 00:04:05.032 10:35:21 -- setup/acl.sh@48 -- # setup reset 00:04:05.032 10:35:21 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:05.032 10:35:21 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:08.410 00:04:08.410 real 0m8.039s 00:04:08.410 user 0m2.105s 00:04:08.410 sys 0m4.471s 00:04:08.410 10:35:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:08.410 10:35:24 -- common/autotest_common.sh@10 -- # set +x 00:04:08.410 ************************************ 00:04:08.410 END TEST allowed 00:04:08.410 ************************************ 00:04:08.410 00:04:08.410 real 0m23.609s 00:04:08.410 user 0m7.451s 00:04:08.410 sys 0m14.368s 00:04:08.410 10:35:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:08.410 10:35:24 -- common/autotest_common.sh@10 -- # set +x 00:04:08.410 ************************************ 00:04:08.410 END TEST acl 00:04:08.410 ************************************ 00:04:08.410 10:35:24 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:08.410 10:35:24 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:08.411 10:35:24 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:08.411 10:35:24 -- common/autotest_common.sh@10 -- # set +x 00:04:08.411 ************************************ 00:04:08.411 START TEST hugepages 00:04:08.411 ************************************ 00:04:08.411 10:35:24 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:08.411 * Looking for test storage... 
00:04:08.411 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:08.411 10:35:24 -- setup/hugepages.sh@10 -- # nodes_sys=() 00:04:08.411 10:35:24 -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:04:08.411 10:35:24 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:04:08.411 10:35:24 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:04:08.411 10:35:24 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:04:08.411 10:35:24 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:04:08.411 10:35:24 -- setup/common.sh@17 -- # local get=Hugepagesize 00:04:08.411 10:35:24 -- setup/common.sh@18 -- # local node= 00:04:08.411 10:35:24 -- setup/common.sh@19 -- # local var val 00:04:08.411 10:35:24 -- setup/common.sh@20 -- # local mem_f mem 00:04:08.411 10:35:24 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:08.411 10:35:24 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:08.411 10:35:24 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:08.411 10:35:24 -- setup/common.sh@28 -- # mapfile -t mem 00:04:08.411 10:35:24 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.411 10:35:24 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 40403168 kB' 'MemAvailable: 42785320 kB' 'Buffers: 12536 kB' 'Cached: 11508356 kB' 'SwapCached: 16 kB' 'Active: 9742372 kB' 'Inactive: 2354388 kB' 'Active(anon): 9267020 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 579304 kB' 'Mapped: 175340 kB' 'Shmem: 8748240 kB' 'KReclaimable: 246032 kB' 'Slab: 774480 kB' 'SReclaimable: 246032 kB' 'SUnreclaim: 528448 kB' 'KernelStack: 21904 kB' 'PageTables: 8400 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36439068 kB' 'Committed_AS: 10669124 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213316 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # [[ Buffers == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.411 10:35:24 
-- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.411 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.411 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 
00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 
00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # continue 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.412 10:35:24 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.412 10:35:24 -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:08.412 10:35:24 -- setup/common.sh@33 -- # echo 2048 00:04:08.412 10:35:24 -- setup/common.sh@33 -- # return 0 00:04:08.412 10:35:24 -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:04:08.413 10:35:24 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:04:08.413 10:35:24 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:04:08.413 10:35:24 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:04:08.413 10:35:24 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:04:08.413 10:35:24 -- 
setup/hugepages.sh@23 -- # unset -v HUGENODE 00:04:08.413 10:35:24 -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:04:08.413 10:35:24 -- setup/hugepages.sh@207 -- # get_nodes 00:04:08.413 10:35:24 -- setup/hugepages.sh@27 -- # local node 00:04:08.413 10:35:24 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:08.413 10:35:24 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048 00:04:08.413 10:35:24 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:08.413 10:35:24 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:08.413 10:35:24 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:08.413 10:35:24 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:08.413 10:35:24 -- setup/hugepages.sh@208 -- # clear_hp 00:04:08.413 10:35:24 -- setup/hugepages.sh@37 -- # local node hp 00:04:08.413 10:35:24 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:08.413 10:35:24 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:08.413 10:35:24 -- setup/hugepages.sh@41 -- # echo 0 00:04:08.413 10:35:24 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:08.413 10:35:24 -- setup/hugepages.sh@41 -- # echo 0 00:04:08.413 10:35:24 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:08.413 10:35:24 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:08.413 10:35:24 -- setup/hugepages.sh@41 -- # echo 0 00:04:08.413 10:35:24 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:08.413 10:35:24 -- setup/hugepages.sh@41 -- # echo 0 00:04:08.413 10:35:24 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:08.413 10:35:24 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:08.413 10:35:24 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:04:08.413 10:35:24 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:08.413 10:35:24 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:08.413 10:35:24 -- common/autotest_common.sh@10 -- # set +x 00:04:08.413 ************************************ 00:04:08.413 START TEST default_setup 00:04:08.413 ************************************ 00:04:08.413 10:35:24 -- common/autotest_common.sh@1104 -- # default_setup 00:04:08.413 10:35:24 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:04:08.413 10:35:24 -- setup/hugepages.sh@49 -- # local size=2097152 00:04:08.413 10:35:24 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:08.413 10:35:24 -- setup/hugepages.sh@51 -- # shift 00:04:08.413 10:35:24 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:08.413 10:35:24 -- setup/hugepages.sh@52 -- # local node_ids 00:04:08.413 10:35:24 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:08.413 10:35:24 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:08.413 10:35:24 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:08.413 10:35:24 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:08.413 10:35:24 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:08.413 10:35:24 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:08.413 10:35:24 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:08.413 10:35:24 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:08.413 10:35:24 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:08.413 10:35:24 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 
00:04:08.413 10:35:24 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:08.413 10:35:24 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:08.413 10:35:24 -- setup/hugepages.sh@73 -- # return 0 00:04:08.413 10:35:24 -- setup/hugepages.sh@137 -- # setup output 00:04:08.413 10:35:24 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:08.413 10:35:24 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:11.704 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:11.704 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:11.704 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:11.704 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:11.704 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:11.704 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:11.704 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:11.704 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:11.704 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:11.704 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:11.704 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:11.704 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:11.704 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:11.704 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:11.704 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:11.704 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:13.613 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:13.613 10:35:29 -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:04:13.613 10:35:29 -- setup/hugepages.sh@89 -- # local node 00:04:13.613 10:35:29 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:13.613 10:35:29 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:13.613 10:35:29 -- setup/hugepages.sh@92 -- # local surp 00:04:13.613 10:35:29 -- setup/hugepages.sh@93 -- # local resv 00:04:13.613 10:35:29 -- setup/hugepages.sh@94 -- # local anon 00:04:13.613 10:35:29 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:13.613 10:35:29 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:13.613 10:35:29 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:13.613 10:35:29 -- setup/common.sh@18 -- # local node= 00:04:13.613 10:35:29 -- setup/common.sh@19 -- # local var val 00:04:13.613 10:35:29 -- setup/common.sh@20 -- # local mem_f mem 00:04:13.613 10:35:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:13.613 10:35:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:13.613 10:35:29 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:13.613 10:35:29 -- setup/common.sh@28 -- # mapfile -t mem 00:04:13.613 10:35:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:13.613 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.613 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.613 10:35:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42584312 kB' 'MemAvailable: 44966424 kB' 'Buffers: 12536 kB' 'Cached: 11508480 kB' 'SwapCached: 16 kB' 'Active: 9752460 kB' 'Inactive: 2354388 kB' 'Active(anon): 9277108 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 588676 kB' 'Mapped: 176128 kB' 'Shmem: 8748364 kB' 'KReclaimable: 245952 kB' 'Slab: 773200 kB' 'SReclaimable: 245952 kB' 'SUnreclaim: 527248 kB' 'KernelStack: 
21904 kB' 'PageTables: 8472 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10680536 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213348 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:04:13.613 10:35:29 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.613 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.613 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.613 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.613 10:35:29 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.613 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.613 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.613 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.613 10:35:29 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.613 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.613 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.613 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.613 10:35:29 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.613 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.613 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.613 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.613 10:35:29 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.613 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.613 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.613 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.613 10:35:29 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.613 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.613 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.613 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.613 10:35:29 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.613 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.613 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.613 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.613 10:35:29 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 
10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ 
KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.614 10:35:29 -- setup/common.sh@33 -- # echo 0 00:04:13.614 10:35:29 -- setup/common.sh@33 -- # return 0 00:04:13.614 10:35:29 -- setup/hugepages.sh@97 -- # anon=0 00:04:13.614 10:35:29 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:13.614 10:35:29 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:13.614 10:35:29 -- setup/common.sh@18 -- # local node= 00:04:13.614 10:35:29 -- setup/common.sh@19 -- # local var val 00:04:13.614 10:35:29 -- setup/common.sh@20 -- # local mem_f mem 00:04:13.614 10:35:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:13.614 10:35:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:13.614 10:35:29 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:13.614 10:35:29 -- setup/common.sh@28 -- # mapfile -t mem 00:04:13.614 10:35:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42580032 kB' 'MemAvailable: 44962144 kB' 'Buffers: 12536 kB' 'Cached: 11508484 kB' 'SwapCached: 16 kB' 'Active: 9755700 kB' 'Inactive: 2354388 kB' 'Active(anon): 9280348 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591888 kB' 'Mapped: 176088 kB' 'Shmem: 8748368 kB' 'KReclaimable: 245952 kB' 'Slab: 773200 kB' 'SReclaimable: 245952 kB' 'SUnreclaim: 527248 kB' 'KernelStack: 21984 kB' 'PageTables: 8496 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10684244 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213380 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 
kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.614 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.614 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- 
setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ 
SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 
00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.615 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.615 10:35:29 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.616 10:35:29 -- setup/common.sh@33 -- # echo 0 00:04:13.616 10:35:29 -- setup/common.sh@33 -- # return 0 00:04:13.616 10:35:29 -- setup/hugepages.sh@99 -- # surp=0 00:04:13.616 10:35:29 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:13.616 10:35:29 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:13.616 10:35:29 -- setup/common.sh@18 -- # local node= 00:04:13.616 10:35:29 -- setup/common.sh@19 -- # local var val 00:04:13.616 10:35:29 -- setup/common.sh@20 -- # local mem_f mem 00:04:13.616 10:35:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:13.616 10:35:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:13.616 10:35:29 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:13.616 10:35:29 -- setup/common.sh@28 -- # mapfile -t mem 00:04:13.616 10:35:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.616 10:35:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42580320 kB' 'MemAvailable: 44962432 kB' 'Buffers: 12536 kB' 'Cached: 11508484 kB' 'SwapCached: 16 kB' 'Active: 9755756 kB' 'Inactive: 2354388 kB' 'Active(anon): 9280404 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592384 kB' 'Mapped: 176356 kB' 'Shmem: 8748368 kB' 'KReclaimable: 245952 kB' 'Slab: 773180 kB' 'SReclaimable: 245952 kB' 'SUnreclaim: 527228 kB' 'KernelStack: 22048 kB' 'PageTables: 8488 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10685460 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213448 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.616 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.616 10:35:29 -- setup/common.sh@32 -- # 
[xtrace elided: setup/common.sh@31-32 loops `IFS=': '` / `read -r var val _` over every /proc/meminfo line, testing each key from Buffers through AnonHugePages against HugePages_Rsvd and hitting `continue` on every non-match]
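As an aside for readability: the loop being elided here is a plain meminfo parser. A minimal bash sketch, reconstructed from the xtrace alone (function name and layout are illustrative assumptions, not the shipped setup/common.sh):

#!/usr/bin/env bash
shopt -s extglob  # needed for the +([0-9]) prefix-strip pattern below

# get_meminfo_sketch KEY [NODE] -- hypothetical stand-in for the traced helper:
# read /proc/meminfo (or the per-node view), split each line on ': ', and
# echo the value of the first key that matches.
get_meminfo_sketch() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local mem var val _ line
    mapfile -t mem < "$mem_f"
    # Per-node files prefix each line with "Node <N> "; strip that prefix.
    mem=("${mem[@]#Node +([0-9]) }")
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1
}

get_meminfo_sketch HugePages_Rsvd   # prints 0 in this run, per the trace below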
setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:13.617 10:35:29 -- setup/common.sh@33 -- # echo 0 00:04:13.617 10:35:29 -- setup/common.sh@33 -- # return 0 00:04:13.617 10:35:29 -- setup/hugepages.sh@100 -- # resv=0 00:04:13.617 10:35:29 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:13.617 nr_hugepages=1024 00:04:13.617 10:35:29 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:13.617 resv_hugepages=0 00:04:13.617 10:35:29 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:13.617 surplus_hugepages=0 00:04:13.617 10:35:29 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:13.617 anon_hugepages=0 00:04:13.617 10:35:29 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:13.617 10:35:29 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:13.617 10:35:29 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:13.617 10:35:29 -- setup/common.sh@17 -- # local 
get=HugePages_Total 00:04:13.617 10:35:29 -- setup/common.sh@18 -- # local node= 00:04:13.617 10:35:29 -- setup/common.sh@19 -- # local var val 00:04:13.617 10:35:29 -- setup/common.sh@20 -- # local mem_f mem 00:04:13.617 10:35:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:13.617 10:35:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:13.617 10:35:29 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:13.617 10:35:29 -- setup/common.sh@28 -- # mapfile -t mem 00:04:13.617 10:35:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.617 10:35:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42584832 kB' 'MemAvailable: 44966944 kB' 'Buffers: 12536 kB' 'Cached: 11508508 kB' 'SwapCached: 16 kB' 'Active: 9751144 kB' 'Inactive: 2354388 kB' 'Active(anon): 9275792 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 587680 kB' 'Mapped: 176008 kB' 'Shmem: 8748392 kB' 'KReclaimable: 245952 kB' 'Slab: 773180 kB' 'SReclaimable: 245952 kB' 'SUnreclaim: 527228 kB' 'KernelStack: 21888 kB' 'PageTables: 8148 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10680976 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213460 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.617 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.617 10:35:29 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
[xtrace elided: setup/common.sh@31-32 repeats the compare/continue scan for the keys Active through ShmemPmdMapped — none matches HugePages_Total]
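The lookup in progress feeds the consistency check at hugepages.sh@107/@110 visible just below: the kernel's HugePages_Total must equal the requested pages plus surplus plus reserved. A hedged sketch with this run's values (get_meminfo_sketch is the hypothetical helper sketched earlier, not SPDK's own function):

nr_hugepages=1024 surp=0 resv=0
total=$(get_meminfo_sketch HugePages_Total)   # 1024 on this machine
if (( total == nr_hugepages + surp + resv )); then
    echo "hugepage accounting consistent: total=$total"
else
    echo "mismatch: total=$total expected=$((nr_hugepages + surp + resv))" >&2
fi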
00:04:13.619 10:35:29 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:13.619 10:35:29 -- setup/common.sh@33 -- # echo 1024 00:04:13.619 10:35:29 -- setup/common.sh@33 -- # return 0 00:04:13.619 10:35:29 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:13.619 10:35:29 -- setup/hugepages.sh@112 -- # get_nodes 00:04:13.619 10:35:29 -- setup/hugepages.sh@27 -- # local node 00:04:13.619 10:35:29 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:13.619 10:35:29 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:13.619 10:35:29 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:13.619 10:35:29 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:13.619 10:35:29 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:13.619 10:35:29 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:13.619 10:35:29 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:13.619 10:35:29 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:13.619 10:35:29 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:13.619 10:35:29 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:13.619 10:35:29 -- setup/common.sh@18 -- # local node=0 00:04:13.619 10:35:29 -- setup/common.sh@19 -- # local var val 00:04:13.619 10:35:29 -- setup/common.sh@20 -- # local mem_f mem 00:04:13.619 10:35:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:13.619 10:35:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:13.619 10:35:29 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:13.619 10:35:29 -- setup/common.sh@28 -- # mapfile -t mem 00:04:13.619 10:35:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.619 10:35:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 25413028 kB' 'MemUsed: 7179056 kB' 'SwapCached: 16 
kB' 'Active: 3396460 kB' 'Inactive: 180704 kB' 'Active(anon): 3179840 kB' 'Inactive(anon): 16 kB' 'Active(file): 216620 kB' 'Inactive(file): 180688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3353720 kB' 'Mapped: 107792 kB' 'AnonPages: 227144 kB' 'Shmem: 2956396 kB' 'KernelStack: 12328 kB' 'PageTables: 4132 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124976 kB' 'Slab: 366304 kB' 'SReclaimable: 124976 kB' 'SUnreclaim: 241328 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.619 10:35:29 -- setup/common.sh@32 -- # continue 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.619 10:35:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.619 
[xtrace elided: setup/common.sh@31-32 repeats the compare/continue scan over the node0 meminfo keys from Unevictable through HugePages_Free — none matches HugePages_Surp]
00:04:13.620 10:35:29 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:13.620 10:35:29 -- setup/common.sh@33 -- # echo 0
00:04:13.620 10:35:29 -- setup/common.sh@33 -- # return 0
00:04:13.620 10:35:29 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:13.620 10:35:29 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:13.620 10:35:29 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:13.620 10:35:29 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:13.620 10:35:29 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:13.620 node0=1024 expecting 1024
00:04:13.620 10:35:29 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:13.620
00:04:13.620 real 0m4.924s
00:04:13.620 user 0m1.188s
00:04:13.620 sys 0m2.184s
00:04:13.620 10:35:29 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:13.620 10:35:29 -- common/autotest_common.sh@10 -- # set +x
00:04:13.620 ************************************
00:04:13.620 END TEST default_setup
00:04:13.620 ************************************
00:04:13.620 10:35:29 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:04:13.620 10:35:29 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:13.620 10:35:29 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:13.620 10:35:29 -- common/autotest_common.sh@10 -- # set +x
00:04:13.620 ************************************
00:04:13.620 START TEST per_node_1G_alloc
00:04:13.620 ************************************
00:04:13.620 10:35:29 -- common/autotest_common.sh@1104 -- # per_node_1G_alloc
00:04:13.620 10:35:29 -- setup/hugepages.sh@143 -- # local IFS=,
00:04:13.620 10:35:29 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:04:13.620 10:35:29 -- setup/hugepages.sh@49 -- # local size=1048576
00:04:13.620 10:35:29 -- setup/hugepages.sh@50 -- # (( 3 > 1 ))
00:04:13.620 10:35:29 -- setup/hugepages.sh@51 -- # shift
00:04:13.620 10:35:29 -- setup/hugepages.sh@52 -- # node_ids=('0' '1')
00:04:13.620 10:35:29 -- setup/hugepages.sh@52 -- # local node_ids
00:04:13.620 10:35:29 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:13.620 10:35:29 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:04:13.620 10:35:29 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1
00:04:13.620 10:35:29 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1')
00:04:13.620 10:35:29 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:13.620 10:35:29 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:04:13.620 10:35:29 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:13.620 10:35:29 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:13.620 10:35:29 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:13.620 10:35:29 -- setup/hugepages.sh@69 -- # (( 2 > 0 ))
00:04:13.620 10:35:29 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:13.620 10:35:29 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:04:13.620 10:35:29 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:13.620 10:35:29 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:04:13.620 10:35:29 -- setup/hugepages.sh@73 -- # return 0
00:04:13.620 10:35:29 -- setup/hugepages.sh@146 -- # NRHUGE=512
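The get_test_nr_hugepages prologue just traced reduces to simple arithmetic; a hedged sketch with illustrative variable names (the division assumes the 2048 kB Hugepagesize reported in the dumps above):

# 1 GiB requested in kB, divided into default 2 MiB hugepages,
# then assigned to each node the caller listed (nodes 0 and 1 here).
size_kb=1048576
default_hugepage_kb=2048
nr_hugepages=$(( size_kb / default_hugepage_kb ))   # 512
declare -A nodes_test
for node_id in 0 1; do
    nodes_test[$node_id]=$nr_hugepages
done
echo "NRHUGE=$nr_hugepages HUGENODE=0,1"   # matches the values the trace resumes with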
10:35:29 -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:04:13.620 10:35:29 -- setup/hugepages.sh@146 -- # setup output 00:04:13.620 10:35:29 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:13.620 10:35:29 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:16.957 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:16.957 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:16.957 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:16.957 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:16.957 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:16.957 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:16.957 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:16.957 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:16.957 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:16.957 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:16.957 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:16.957 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:16.957 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:16.957 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:16.957 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:16.957 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:16.957 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:16.957 10:35:33 -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:04:16.957 10:35:33 -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:04:16.957 10:35:33 -- setup/hugepages.sh@89 -- # local node 00:04:16.957 10:35:33 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:16.957 10:35:33 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:16.957 10:35:33 -- setup/hugepages.sh@92 -- # local surp 00:04:16.957 10:35:33 -- setup/hugepages.sh@93 -- # local resv 00:04:16.957 10:35:33 -- setup/hugepages.sh@94 -- # local anon 00:04:16.957 10:35:33 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:16.957 10:35:33 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:16.957 10:35:33 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:16.957 10:35:33 -- setup/common.sh@18 -- # local node= 00:04:16.957 10:35:33 -- setup/common.sh@19 -- # local var val 00:04:16.957 10:35:33 -- setup/common.sh@20 -- # local mem_f mem 00:04:16.957 10:35:33 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:16.957 10:35:33 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:16.957 10:35:33 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:16.957 10:35:33 -- setup/common.sh@28 -- # mapfile -t mem 00:04:16.957 10:35:33 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:16.957 10:35:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.957 10:35:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.957 10:35:33 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42622508 kB' 'MemAvailable: 45004604 kB' 'Buffers: 12536 kB' 'Cached: 11508596 kB' 'SwapCached: 16 kB' 'Active: 9755644 kB' 'Inactive: 2354388 kB' 'Active(anon): 9280292 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591624 kB' 'Mapped: 
174948 kB' 'Shmem: 8748480 kB' 'KReclaimable: 245920 kB' 'Slab: 772920 kB' 'SReclaimable: 245920 kB' 'SUnreclaim: 527000 kB' 'KernelStack: 21904 kB' 'PageTables: 8444 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10677736 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213620 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:04:16.957 10:35:33 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.957 10:35:33 -- setup/common.sh@32 -- # continue 00:04:16.957 10:35:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.957 10:35:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.957 10:35:33 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.957 10:35:33 -- setup/common.sh@32 -- # continue 00:04:16.957 10:35:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.957 10:35:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.957 10:35:33 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.957 10:35:33 -- setup/common.sh@32 -- # continue 00:04:16.957 10:35:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.957 10:35:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.957 10:35:33 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.957 10:35:33 -- setup/common.sh@32 -- # continue 00:04:16.957 10:35:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.957 10:35:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.957 10:35:33 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.957 10:35:33 -- setup/common.sh@32 -- # continue 00:04:16.957 10:35:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.957 10:35:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.957 10:35:33 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.957 10:35:33 -- setup/common.sh@32 -- # continue 00:04:16.957 10:35:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.957 10:35:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.957 10:35:33 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.957 10:35:33 -- setup/common.sh@32 -- # continue 00:04:16.957 10:35:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.957 10:35:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.957 10:35:33 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.957 10:35:33 -- setup/common.sh@32 -- # continue 00:04:16.957 10:35:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.957 10:35:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.957 10:35:33 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.957 10:35:33 -- setup/common.sh@32 -- # continue 00:04:16.957 10:35:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.958 10:35:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:16.958 10:35:33 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:16.958 10:35:33 -- setup/common.sh@32 -- # continue 00:04:16.958 10:35:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:16.958 10:35:33 -- 
[xtrace elided: the compare/continue scan runs on for Active(file) through Committed_AS, none matching AnonHugePages; the scan reaches the VmallocTotal test]
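The AnonHugePages scan elided above serves the guard traced at hugepages.sh@96-97: anonymous hugepages only count toward the total when transparent hugepages are not pinned to "never". A hedged sketch of that guard (the helper is the earlier illustrative one, not SPDK's own):

# The enabled file reads like "always [madvise] never"; the bracketed token
# is the active mode, so any mode other than [never] makes THP relevant.
thp_mode=$(cat /sys/kernel/mm/transparent_hugepage/enabled 2>/dev/null)
anon=0
if [[ $thp_mode != *"[never]"* ]]; then
    anon=$(get_meminfo_sketch AnonHugePages)   # 0 kB in this run
fi
echo "anon_hugepages=$anon"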
00:04:16.958 10:35:33 -- setup/common.sh@32 -- # continue
[setup/common.sh@31-32 loop: VmallocUsed, VmallocChunk, Percpu and HardwareCorrupted each fail the match against \A\n\o\n\H\u\g\e\P\a\g\e\s and hit 'continue']
00:04:16.958 10:35:33 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:16.958 10:35:33 -- setup/common.sh@33 -- # echo 0
00:04:16.958 10:35:33 -- setup/common.sh@33 -- # return 0
00:04:16.958 10:35:33 -- setup/hugepages.sh@97 -- # anon=0
00:04:16.958 10:35:33 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:16.958 10:35:33 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:16.958 10:35:33 -- setup/common.sh@18 -- # local node=
00:04:16.958 10:35:33 -- setup/common.sh@19 -- # local var val
00:04:16.958 10:35:33 -- setup/common.sh@20 -- # local mem_f mem
00:04:16.958 10:35:33 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:16.958 10:35:33 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:16.958 10:35:33 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:16.958 10:35:33 -- setup/common.sh@28 -- # mapfile -t mem
00:04:16.958 10:35:33 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:16.958 10:35:33 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42638408 kB' 'MemAvailable: 45020504 kB' 'Buffers: 12536 kB' 'Cached: 11508596 kB' 'SwapCached: 16 kB' 'Active: 9751364 kB' 'Inactive: 2354388 kB' 'Active(anon): 9276012 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 587996 kB' 'Mapped: 175164 kB' 'Shmem: 8748480 kB' 'KReclaimable: 245920 kB' 'Slab: 772624 kB' 'SReclaimable: 245920 kB' 'SUnreclaim: 526704 kB' 'KernelStack: 21968 kB' 'PageTables: 8596 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10673920 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213604 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[setup/common.sh@31-32 loop: each field from MemTotal through HugePages_Rsvd fails the match against \H\u\g\e\P\a\g\e\s\_\S\u\r\p and hits 'continue']
00:04:16.960 10:35:33 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:16.960 10:35:33 -- setup/common.sh@33 -- # echo 0
00:04:16.960 10:35:33 -- setup/common.sh@33 -- # return 0
00:04:16.960 10:35:33 -- setup/hugepages.sh@99 -- # surp=0
00:04:16.960 10:35:33 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:16.960 10:35:33 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:16.960 10:35:33 -- setup/common.sh@18 -- # local node=
00:04:16.960 10:35:33 -- setup/common.sh@19 -- # local var val
00:04:16.960 10:35:33 -- setup/common.sh@20 -- # local mem_f mem
00:04:16.960 10:35:33 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:16.960 10:35:33 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:16.960 10:35:33 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:16.960 10:35:33 -- setup/common.sh@28 -- # mapfile -t mem
00:04:16.960 10:35:33 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:16.960 10:35:33 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42633428 kB' 'MemAvailable: 45015524 kB' 'Buffers: 12536 kB' 'Cached: 11508608 kB' 'SwapCached: 16 kB' 'Active: 9753728 kB' 'Inactive: 2354388 kB' 'Active(anon): 9278376 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 590180 kB' 'Mapped: 174820 kB' 'Shmem: 8748492 kB' 'KReclaimable: 245920 kB' 'Slab: 772624 kB' 'SReclaimable: 245920 kB' 'SUnreclaim: 526704 kB' 'KernelStack: 21968 kB' 'PageTables: 8348 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10677232 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213572 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[setup/common.sh@31-32 loop: each field from MemTotal through HugePages_Free fails the match against \H\u\g\e\P\a\g\e\s\_\R\s\v\d and hits 'continue']
00:04:16.961 10:35:33 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:16.961 10:35:33 -- setup/common.sh@33 -- # echo 0
00:04:16.961 10:35:33 -- setup/common.sh@33 -- # return 0
00:04:16.961 10:35:33 -- setup/hugepages.sh@100 -- # resv=0
00:04:16.961 10:35:33 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
nr_hugepages=1024
00:04:16.961 10:35:33 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
resv_hugepages=0
00:04:16.961 10:35:33 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
surplus_hugepages=0
00:04:16.961 10:35:33 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
anon_hugepages=0
00:04:16.961 10:35:33 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:16.961 10:35:33 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
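The get_meminfo calls traced above (AnonHugePages, HugePages_Surp, HugePages_Rsvd, and HugePages_Total just below) all follow the same pattern: choose /proc/meminfo or a per-node meminfo file, strip the "Node <id> " prefix that the per-node files carry, then scan "Key: value" pairs until the requested field matches and echo its value. A minimal bash sketch reconstructed from this xtrace follows; the real setup/common.sh helper may differ in detail.

    # get_meminfo KEY [NODE] -- sketch reconstructed from the trace above
    shopt -s extglob   # needed for the +([0-9]) pattern below

    get_meminfo() {
        local get=$1
        local node=$2
        local var val
        local mem_f mem
        mem_f=/proc/meminfo
        # With a node id, read that node's view instead. With node empty,
        # the trace probes the non-existent path .../node/node/meminfo and
        # falls through to /proc/meminfo via the [[ -n '' ]] test.
        if [[ -e /sys/devices/system/node/node$node/meminfo && -n $node ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # Per-node meminfo prefixes every line with "Node <id> "; strip it
        # so both sources parse the same way.
        mem=("${mem[@]#Node +([0-9]) }")
        # One [[ ... == ... ]] probe per field, exactly as in the scans above.
        while IFS=': ' read -r var val _; do
            if [[ $var == "$get" ]]; then
                echo "$val"
                return 0
            fi
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }

    # The accounting above boils down to (values from the snapshots):
    #   anon=$(get_meminfo AnonHugePages)    # 0
    #   surp=$(get_meminfo HugePages_Surp)   # 0
    #   resv=$(get_meminfo HugePages_Rsvd)   # 0
    #   (( 1024 == nr_hugepages + surp + resv ))   # 1024 == 1024 + 0 + 0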
00:04:16.961 10:35:33 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:16.961 10:35:33 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:16.961 10:35:33 -- setup/common.sh@18 -- # local node=
00:04:16.961 10:35:33 -- setup/common.sh@19 -- # local var val
00:04:16.961 10:35:33 -- setup/common.sh@20 -- # local mem_f mem
00:04:16.961 10:35:33 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:16.961 10:35:33 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:16.961 10:35:33 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:16.961 10:35:33 -- setup/common.sh@28 -- # mapfile -t mem
00:04:16.961 10:35:33 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:16.961 10:35:33 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42630336 kB' 'MemAvailable: 45012432 kB' 'Buffers: 12536 kB' 'Cached: 11508620 kB' 'SwapCached: 16 kB' 'Active: 9756608 kB' 'Inactive: 2354388 kB' 'Active(anon): 9281256 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593056 kB' 'Mapped: 175112 kB' 'Shmem: 8748504 kB' 'KReclaimable: 245920 kB' 'Slab: 772624 kB' 'SReclaimable: 245920 kB' 'SUnreclaim: 526704 kB' 'KernelStack: 22016 kB' 'PageTables: 8304 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10678740 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213588 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[setup/common.sh@31-32 loop: each field from MemTotal through Unaccepted fails the match against \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l and hits 'continue']
00:04:16.963 10:35:33 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:16.963 10:35:33 -- setup/common.sh@33 -- # echo 1024
00:04:16.963 10:35:33 -- setup/common.sh@33 -- # return 0
00:04:16.963 10:35:33 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:16.963 10:35:33 -- setup/hugepages.sh@112 -- # get_nodes
00:04:16.963 10:35:33 -- setup/hugepages.sh@27 -- # local node
00:04:16.963 10:35:33 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:16.963 10:35:33 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:16.963 10:35:33 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:16.963 10:35:33 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:16.963 10:35:33 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:16.963 10:35:33 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:16.963 10:35:33 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:16.963 10:35:33 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
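The get_nodes block just traced builds one nodes_sys entry per NUMA node directory, and the hugepages.sh@115-117 loop that follows folds reserved pages and each node's surplus into nodes_test before those expected counts are checked. A sketch of that bookkeeping, reusing the get_meminfo sketch above; the trace only shows the resulting nodes_sys[...]=512 assignments, so the per-node sysfs nr_hugepages read below is an assumption.

    shopt -s extglob
    nodes_sys=()    # kernel's per-node huge page counts, indexed by node id
    nodes_test=()   # expected per-node counts being assembled

    # get_nodes, as traced at hugepages.sh@27-33
    get_nodes() {
        local node
        for node in /sys/devices/system/node/node+([0-9]); do
            # assumption: the traced 512 comes from the node's 2 MiB pool size
            nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
        done
        no_nodes=${#nodes_sys[@]}   # 2 in this run
        (( no_nodes > 0 ))
    }

    get_nodes
    # nodes_test is populated before hugepages.sh@115, outside this capture;
    # seeded to 512 per node here so the sketch runs on its own.
    for node in "${!nodes_sys[@]}"; do nodes_test[node]=512; done

    # hugepages.sh@115-117: per node, add the reserved pages (resv, 0 in this
    # run) and that node's surplus (also 0), so the expectation stays at 512.
    resv=0
    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += resv ))
        (( nodes_test[node] += $(get_meminfo HugePages_Surp "$node") ))
    done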
00:04:16.963 10:35:33 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:16.963 10:35:33 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:16.963 10:35:33 -- setup/common.sh@18 -- # local node=0
00:04:16.963 10:35:33 -- setup/common.sh@19 -- # local var val
00:04:16.963 10:35:33 -- setup/common.sh@20 -- # local mem_f mem
00:04:16.963 10:35:33 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:16.963 10:35:33 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:16.963 10:35:33 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:16.963 10:35:33 -- setup/common.sh@28 -- # mapfile -t mem
00:04:16.963 10:35:33 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:16.963 10:35:33 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 26495984 kB' 'MemUsed: 6096100 kB' 'SwapCached: 16 kB' 'Active: 3391176 kB' 'Inactive: 180704 kB' 'Active(anon): 3174556 kB' 'Inactive(anon): 16 kB' 'Active(file): 216620 kB' 'Inactive(file): 180688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3353792 kB' 'Mapped: 107140 kB' 'AnonPages: 221252 kB' 'Shmem: 2956468 kB' 'KernelStack: 12296 kB' 'PageTables: 4000 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124976 kB' 'Slab: 366096 kB' 'SReclaimable: 124976 kB' 'SUnreclaim: 241120 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[setup/common.sh@31-32 loop: each node0 field from MemTotal through HugePages_Free fails the match against \H\u\g\e\P\a\g\e\s\_\S\u\r\p and hits 'continue']
00:04:16.964 10:35:33 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:16.964 10:35:33 -- setup/common.sh@33 -- # echo 0
00:04:16.964 10:35:33 -- setup/common.sh@33 -- # return 0
00:04:16.964 10:35:33 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:16.964 10:35:33 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:16.964 10:35:33 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:16.964 10:35:33 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:16.964 10:35:33 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:16.964 10:35:33 -- setup/common.sh@18 -- # local node=1
00:04:16.964 10:35:33 -- setup/common.sh@19 -- # local var val
00:04:16.964 10:35:33 -- setup/common.sh@20 -- # local mem_f mem
00:04:16.964 10:35:33 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:16.964 10:35:33 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:16.964 10:35:33 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:16.964 10:35:33 -- setup/common.sh@28 -- # mapfile -t mem
00:04:16.964 10:35:33 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:16.964 10:35:33 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703148 kB' 'MemFree: 16141412 kB' 'MemUsed: 11561736 kB' 'SwapCached: 0 kB' 'Active: 6359412 kB' 'Inactive: 2173684 kB' 'Active(anon): 6100680 kB' 'Inactive(anon): 57072 kB' 'Active(file): 258732 kB' 'Inactive(file): 2116612 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8167384 kB' 'Mapped: 67308 kB' 'AnonPages: 365764 kB' 'Shmem: 5792040 kB' 'KernelStack: 9432 kB' 'PageTables: 4000 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 120944 kB' 'Slab: 406528 kB' 'SReclaimable: 120944 kB' 'SUnreclaim: 285584 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... setup/common.sh@31-32 read/compare trace repeats for every node1 meminfo field until HugePages_Surp is reached ...]
00:04:16.965 10:35:33 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:16.965 10:35:33 -- setup/common.sh@33 -- # echo 0
00:04:16.965 10:35:33 -- setup/common.sh@33 -- # return 0
00:04:16.965 10:35:33 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:16.965 10:35:33 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:16.965 10:35:33 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:16.965 10:35:33 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:16.965 10:35:33 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:16.965 node0=512 expecting 512
00:04:16.965 10:35:33 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:16.965 10:35:33 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:16.965 10:35:33 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:16.965 10:35:33 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:04:16.965 node1=512 expecting 512
00:04:16.965 10:35:33 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:04:16.965
00:04:16.965 real 0m3.455s
00:04:16.965 user 0m1.334s
00:04:16.965 sys 0m2.184s
00:04:16.965 10:35:33 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:16.965 10:35:33 -- common/autotest_common.sh@10 -- # set +x
00:04:16.965 ************************************
00:04:16.965 END TEST per_node_1G_alloc
00:04:16.965 ************************************
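The START/END banners and the real/user/sys block come from a run_test-style driver. Below is a sketch of the shape such a wrapper would have, reconstructed only from what this log shows (run_test_sketch and the banner width are assumptions, not the real common/autotest_common.sh code):

    # Run a named test function under bash's time keyword, with banners.
    run_test_sketch() {
        local test_name=$1
        shift
        echo '************************************'
        echo "START TEST $test_name"
        echo '************************************'
        time "$@"   # emits the real/user/sys lines seen in the log
        echo '************************************'
        echo "END TEST $test_name"
        echo '************************************'
    }

    # Usage mirroring the next trace entry: run_test_sketch even_2G_alloc even_2G_alloc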
00:04:16.965 10:35:33 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc
00:04:16.965 10:35:33 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:16.965 10:35:33 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:16.965 10:35:33 -- common/autotest_common.sh@10 -- # set +x
00:04:16.965 ************************************
00:04:16.965 START TEST even_2G_alloc
00:04:16.965 ************************************
00:04:16.965 10:35:33 -- common/autotest_common.sh@1104 -- # even_2G_alloc
00:04:16.965 10:35:33 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152
00:04:16.965 10:35:33 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:16.965 10:35:33 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:16.965 10:35:33 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:16.965 10:35:33 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:16.965 10:35:33 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:16.965 10:35:33 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:16.965 10:35:33 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:16.965 10:35:33 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:16.965 10:35:33 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:16.965 10:35:33 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:16.965 10:35:33 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:16.965 10:35:33 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:16.965 10:35:33 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:04:16.965 10:35:33 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:16.965 10:35:33 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:16.965 10:35:33 -- setup/hugepages.sh@83 -- # : 512
00:04:16.965 10:35:33 -- setup/hugepages.sh@84 -- # : 1
00:04:16.965 10:35:33 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:16.965 10:35:33 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:16.965 10:35:33 -- setup/hugepages.sh@83 -- # : 0
00:04:16.965 10:35:33 -- setup/hugepages.sh@84 -- # : 0
00:04:16.965 10:35:33 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:16.965 10:35:33 -- setup/hugepages.sh@153 -- # NRHUGE=1024
00:04:16.965 10:35:33 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes
00:04:16.965 10:35:33 -- setup/hugepages.sh@153 -- # setup output
00:04:16.965 10:35:33 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:16.965 10:35:33 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:20.261 0000:00:04.0 through 0000:00:04.7 and 0000:80:04.0 through 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:20.261 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
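The @49-@84 trace above boils down to a small piece of arithmetic: turn the requested size into a hugepage count, then split it evenly across NUMA nodes. A sketch under this run's values (the countdown loop shape is an assumption; units are kB, consistent with 2097152 / 2048 = 1024 and the Hugepagesize: 2048 kB reported in the meminfo dumps below):

    size_kb=2097152                              # 2 GiB requested, in kB
    hugepage_kb=2048                             # default 2 MB hugepages
    nr_hugepages=$(( size_kb / hugepage_kb ))    # -> 1024
    no_nodes=2
    declare -a nodes_test
    for (( node = no_nodes - 1; node >= 0; node-- )); do
        nodes_test[node]=$(( nr_hugepages / no_nodes ))   # -> 512 per node
    done
    echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"  # node0=512 node1=512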
00:04:20.261 10:35:36 -- setup/hugepages.sh@154 -- # verify_nr_hugepages
00:04:20.261 10:35:36 -- setup/hugepages.sh@89 -- # local node
00:04:20.261 10:35:36 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:20.261 10:35:36 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:20.261 10:35:36 -- setup/hugepages.sh@92 -- # local surp
00:04:20.261 10:35:36 -- setup/hugepages.sh@93 -- # local resv
00:04:20.261 10:35:36 -- setup/hugepages.sh@94 -- # local anon
00:04:20.261 10:35:36 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:20.261 10:35:36 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:20.261 10:35:36 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:20.261 10:35:36 -- setup/common.sh@18 -- # local node=
00:04:20.261 10:35:36 -- setup/common.sh@19 -- # local var val
00:04:20.261 10:35:36 -- setup/common.sh@20 -- # local mem_f mem
00:04:20.261 10:35:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:20.261 10:35:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:20.261 10:35:36 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:20.261 10:35:36 -- setup/common.sh@28 -- # mapfile -t mem
00:04:20.261 10:35:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:20.261 10:35:36 -- setup/common.sh@31 -- # IFS=': '
00:04:20.261 10:35:36 -- setup/common.sh@31 -- # read -r var val _
00:04:20.261 10:35:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42660036 kB' 'MemAvailable: 45042132 kB' 'Buffers: 12536 kB' 'Cached: 11508724 kB' 'SwapCached: 16 kB' 'Active: 9751340 kB' 'Inactive: 2354388 kB' 'Active(anon): 9275988 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 587712 kB' 'Mapped: 174848 kB' 'Shmem: 8748608 kB' 'KReclaimable: 245920 kB' 'Slab: 772964 kB' 'SReclaimable: 245920 kB' 'SUnreclaim: 527044 kB' 'KernelStack: 21840 kB' 'PageTables: 8028 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10670568 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213412 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
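The @96 test just above is a transparent-hugepage check: the left-hand side is the content of the THP mode file, where the kernel marks the active mode with brackets. A sketch of that check on its own (an assumed re-implementation; the awk read is illustrative, the script itself uses the traced read loop):

    # e.g. "always [madvise] never" means madvise-only THP is active.
    thp_mode=$(</sys/kernel/mm/transparent_hugepage/enabled)
    if [[ $thp_mode != *"[never]"* ]]; then
        # THP is not disabled, so anonymous hugepages could skew the hugepage
        # accounting; sample AnonHugePages (0 kB in this run's dumps).
        awk '$1 == "AnonHugePages:" {print $2}' /proc/meminfo
    fi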
[... setup/common.sh@31-32 read/compare trace repeats for every /proc/meminfo field until AnonHugePages is reached ...]
00:04:20.262 10:35:36 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:20.262 10:35:36 -- setup/common.sh@33 -- # echo 0
00:04:20.262 10:35:36 -- setup/common.sh@33 -- # return 0
00:04:20.262 10:35:36 -- setup/hugepages.sh@97 -- # anon=0
00:04:20.262 10:35:36 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:20.262 10:35:36 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:20.262 10:35:36 -- setup/common.sh@18 -- # local node=
00:04:20.262 10:35:36 -- setup/common.sh@19 -- # local var val
00:04:20.262 10:35:36 -- setup/common.sh@20 -- # local mem_f mem
00:04:20.262 10:35:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:20.262 10:35:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:20.262 10:35:36 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:20.262 10:35:36 -- setup/common.sh@28 -- # mapfile -t mem
00:04:20.262 10:35:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:20.262 10:35:36 -- setup/common.sh@31 -- # IFS=': '
00:04:20.262 10:35:36 -- setup/common.sh@31 -- # read -r var val _
00:04:20.263 10:35:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42659588 kB' 'MemAvailable: 45041684 kB' 'Buffers: 12536 kB' 'Cached: 11508724 kB' 'SwapCached: 16 kB' 'Active: 9752996 kB' 'Inactive: 2354388 kB' 'Active(anon): 9277644 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589408 kB' 'Mapped: 174848 kB' 'Shmem: 8748608 kB' 'KReclaimable: 245920 kB' 'Slab: 772956 kB' 'SReclaimable: 245920 kB' 'SUnreclaim: 527036 kB' 'KernelStack: 21776 kB' 'PageTables: 7820 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10671900 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213380 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
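Each of the get_meminfo calls in this stretch rescans a fresh snapshot of /proc/meminfo for a single counter. For orientation only, these are equivalent one-shot reads of the counters being gathered (illustrative; the script deliberately reuses its own read loop rather than awk):

    awk '$1 == "HugePages_Surp:"  {print $2}' /proc/meminfo   # -> 0 in this run
    awk '$1 == "HugePages_Rsvd:"  {print $2}' /proc/meminfo   # -> 0 in this run
    awk '$1 == "HugePages_Total:" {print $2}' /proc/meminfo   # -> 1024 in this run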
[... setup/common.sh@31-32 read/compare trace repeats for every /proc/meminfo field until HugePages_Surp is reached ...]
00:04:20.264 10:35:36 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:20.264 10:35:36 -- setup/common.sh@33 -- # echo 0
00:04:20.264 10:35:36 -- setup/common.sh@33 -- # return 0
00:04:20.264 10:35:36 -- setup/hugepages.sh@99 -- # surp=0
00:04:20.264 10:35:36 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:20.264 10:35:36 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:20.264 10:35:36 -- setup/common.sh@18 -- # local node=
00:04:20.264 10:35:36 -- setup/common.sh@19 -- # local var val
00:04:20.264 10:35:36 -- setup/common.sh@20 -- # local mem_f mem
00:04:20.264 10:35:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:20.264 10:35:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:20.264 10:35:36 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:20.264 10:35:36 -- setup/common.sh@28 -- # mapfile -t mem
00:04:20.264 10:35:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:20.264 10:35:36 -- setup/common.sh@31 -- # IFS=': '
00:04:20.264 10:35:36 -- setup/common.sh@31 -- # read -r var val _
00:04:20.264 10:35:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42654544 kB' 'MemAvailable: 45036640 kB' 'Buffers: 12536 kB' 'Cached: 11508736 kB' 'SwapCached: 16 kB' 'Active: 9755964 kB' 'Inactive: 2354388 kB' 'Active(anon): 9280612 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592316 kB' 'Mapped: 175176 kB' 'Shmem: 8748620 kB' 'KReclaimable: 245920 kB' 'Slab: 772996 kB' 'SReclaimable: 245920 kB' 'SUnreclaim: 527076 kB' 'KernelStack: 21776 kB' 'PageTables: 7836 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10675228 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213384 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[... setup/common.sh@31-32 read/compare trace repeats for every /proc/meminfo field until HugePages_Rsvd is reached ...]
setup/common.sh@31 -- # IFS=': ' 00:04:20.265 10:35:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.265 10:35:36 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.265 10:35:36 -- setup/common.sh@32 -- # continue 00:04:20.265 10:35:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.265 10:35:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.265 10:35:36 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.265 10:35:36 -- setup/common.sh@32 -- # continue 00:04:20.265 10:35:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.265 10:35:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.265 10:35:36 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.265 10:35:36 -- setup/common.sh@32 -- # continue 00:04:20.265 10:35:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.265 10:35:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.265 10:35:36 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.265 10:35:36 -- setup/common.sh@32 -- # continue 00:04:20.265 10:35:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.265 10:35:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.265 10:35:36 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.265 10:35:36 -- setup/common.sh@33 -- # echo 0 00:04:20.265 10:35:36 -- setup/common.sh@33 -- # return 0 00:04:20.265 10:35:36 -- setup/hugepages.sh@100 -- # resv=0 00:04:20.265 10:35:36 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:20.265 nr_hugepages=1024 00:04:20.265 10:35:36 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:20.265 resv_hugepages=0 00:04:20.265 10:35:36 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:20.265 surplus_hugepages=0 00:04:20.265 10:35:36 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:20.265 anon_hugepages=0 00:04:20.265 10:35:36 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:20.265 10:35:36 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:20.265 10:35:36 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:20.265 10:35:36 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:20.265 10:35:36 -- setup/common.sh@18 -- # local node= 00:04:20.265 10:35:36 -- setup/common.sh@19 -- # local var val 00:04:20.265 10:35:36 -- setup/common.sh@20 -- # local mem_f mem 00:04:20.265 10:35:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.265 10:35:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:20.265 10:35:36 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:20.265 10:35:36 -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.265 10:35:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.265 10:35:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.265 10:35:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.265 10:35:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42660976 kB' 'MemAvailable: 45043072 kB' 'Buffers: 12536 kB' 'Cached: 11508748 kB' 'SwapCached: 16 kB' 'Active: 9751268 kB' 'Inactive: 2354388 kB' 'Active(anon): 9275916 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 587636 kB' 'Mapped: 174828 kB' 'Shmem: 8748632 kB' 'KReclaimable: 245920 kB' 'Slab: 772976 
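The scan above is setup/common.sh's get_meminfo walking every /proc/meminfo line until the requested field matches. A minimal standalone reconstruction of that loop, with the function body inferred from the trace (the real SPDK helper may differ in detail):

    #!/usr/bin/env bash
    shopt -s extglob
    # Reconstruction of the get_meminfo loop traced above: load a meminfo
    # file, strip the "Node N " prefix that per-node files carry, then scan
    # for the requested field and print its value.
    get_meminfo() {
        local get=$1 node=$2 var val _
        local mem_f=/proc/meminfo
        local mem line
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done
        return 1
    }

    get_meminfo HugePages_Total     # prints 1024 in the run above
    get_meminfo HugePages_Free 0    # per-node form, as used later in the trace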
00:04:20.265 10:35:36 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:20.265 10:35:36 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:20.265 10:35:36 -- setup/common.sh@18 -- # local node=
00:04:20.265 10:35:36 -- setup/common.sh@19 -- # local var val
00:04:20.265 10:35:36 -- setup/common.sh@20 -- # local mem_f mem
00:04:20.265 10:35:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:20.265 10:35:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:20.265 10:35:36 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:20.265 10:35:36 -- setup/common.sh@28 -- # mapfile -t mem
00:04:20.265 10:35:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:20.265 10:35:36 -- setup/common.sh@31 -- # IFS=': '
00:04:20.265 10:35:36 -- setup/common.sh@31 -- # read -r var val _
00:04:20.265 10:35:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42660976 kB' 'MemAvailable: 45043072 kB' 'Buffers: 12536 kB' 'Cached: 11508748 kB' 'SwapCached: 16 kB' 'Active: 9751268 kB' 'Inactive: 2354388 kB' 'Active(anon): 9275916 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 587636 kB' 'Mapped: 174828 kB' 'Shmem: 8748632 kB' 'KReclaimable: 245920 kB' 'Slab: 772976 kB' 'SReclaimable: 245920 kB' 'SUnreclaim: 527056 kB' 'KernelStack: 21792 kB' 'PageTables: 7876 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10670872 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213364 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[setup/common.sh@31-32: read/compare/continue trace repeats for every /proc/meminfo field, MemTotal through Unaccepted]
00:04:20.267 10:35:36 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:20.267 10:35:36 -- setup/common.sh@33 -- # echo 1024
00:04:20.267 10:35:36 -- setup/common.sh@33 -- # return 0
00:04:20.267 10:35:36 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:20.267 10:35:36 -- setup/hugepages.sh@112 -- # get_nodes
00:04:20.267 10:35:36 -- setup/hugepages.sh@27 -- # local node
00:04:20.267 10:35:36 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:20.267 10:35:36 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:20.267 10:35:36 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:20.267 10:35:36 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:20.267 10:35:36 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:20.267 10:35:36 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
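get_nodes, traced just above at hugepages.sh@27-33, enumerates the NUMA node directories and records each node's hugepage count (512 and 512 here). A hedged sketch of it; the exact sysfs file it reads is an assumption for the 2048 kB default page size:

    #!/usr/bin/env bash
    shopt -s extglob
    # Sketch of get_nodes: list NUMA nodes and record the kernel's per-node
    # hugepage count, leaving the results in the global nodes_sys array.
    get_nodes() {
        local node
        nodes_sys=()
        for node in /sys/devices/system/node/node+([0-9]); do
            nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
        done
        no_nodes=${#nodes_sys[@]}
        (( no_nodes > 0 ))   # trace: no_nodes=2, with 512 pages on each node
    }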
00:04:20.267 10:35:36 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:20.267 10:35:36 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:20.267 10:35:36 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:20.267 10:35:36 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:20.267 10:35:36 -- setup/common.sh@18 -- # local node=0
00:04:20.267 10:35:36 -- setup/common.sh@19 -- # local var val
00:04:20.267 10:35:36 -- setup/common.sh@20 -- # local mem_f mem
00:04:20.267 10:35:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:20.267 10:35:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:20.267 10:35:36 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:20.267 10:35:36 -- setup/common.sh@28 -- # mapfile -t mem
00:04:20.267 10:35:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:20.267 10:35:36 -- setup/common.sh@31 -- # IFS=': '
00:04:20.267 10:35:36 -- setup/common.sh@31 -- # read -r var val _
00:04:20.267 10:35:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 26498748 kB' 'MemUsed: 6093336 kB' 'SwapCached: 16 kB' 'Active: 3390792 kB' 'Inactive: 180704 kB' 'Active(anon): 3174172 kB' 'Inactive(anon): 16 kB' 'Active(file): 216620 kB' 'Inactive(file): 180688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3353868 kB' 'Mapped: 107520 kB' 'AnonPages: 220796 kB' 'Shmem: 2956544 kB' 'KernelStack: 12248 kB' 'PageTables: 3852 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124976 kB' 'Slab: 366308 kB' 'SReclaimable: 124976 kB' 'SUnreclaim: 241332 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[setup/common.sh@31-32: read/compare/continue trace repeats for every node0 meminfo field, MemTotal through HugePages_Free]
00:04:20.268 10:35:36 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:20.268 10:35:36 -- setup/common.sh@33 -- # echo 0
00:04:20.268 10:35:36 -- setup/common.sh@33 -- # return 0
00:04:20.268 10:35:36 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
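The per-node lookup above switches mem_f to /sys/devices/system/node/node0/meminfo, whose lines carry a leading "Node N " that the mem=("${mem[@]#Node +([0-9]) }") expansion strips before key matching. Outside the harness the same node-0 query can be made directly; a hypothetical awk one-liner over the same file:

    # Per-node meminfo lines look like "Node 0 HugePages_Surp:   0",
    # so field 3 is the key and field 4 the value.
    awk '$3 == "HugePages_Surp:" {print $4}' /sys/devices/system/node/node0/meminfo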
00:04:20.268 10:35:36 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:20.268 10:35:36 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:20.268 10:35:36 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:20.268 10:35:36 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:20.268 10:35:36 -- setup/common.sh@18 -- # local node=1
00:04:20.268 10:35:36 -- setup/common.sh@19 -- # local var val
00:04:20.268 10:35:36 -- setup/common.sh@20 -- # local mem_f mem
00:04:20.268 10:35:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:20.268 10:35:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:20.268 10:35:36 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:20.268 10:35:36 -- setup/common.sh@28 -- # mapfile -t mem
00:04:20.268 10:35:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:20.268 10:35:36 -- setup/common.sh@31 -- # IFS=': '
00:04:20.268 10:35:36 -- setup/common.sh@31 -- # read -r var val _
00:04:20.268 10:35:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703148 kB' 'MemFree: 16162228 kB' 'MemUsed: 11540920 kB' 'SwapCached: 0 kB' 'Active: 6365676 kB' 'Inactive: 2173684 kB' 'Active(anon): 6106944 kB' 'Inactive(anon): 57072 kB' 'Active(file): 258732 kB' 'Inactive(file): 2116612 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8167432 kB' 'Mapped: 67460 kB' 'AnonPages: 372080 kB' 'Shmem: 5792088 kB' 'KernelStack: 9560 kB' 'PageTables: 4096 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 120944 kB' 'Slab: 406668 kB' 'SReclaimable: 120944 kB' 'SUnreclaim: 285724 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[setup/common.sh@31-32: read/compare/continue trace repeats for every node1 meminfo field, MemTotal through HugePages_Free]
00:04:20.269 10:35:36 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:20.269 10:35:36 -- setup/common.sh@33 -- # echo 0
00:04:20.269 10:35:36 -- setup/common.sh@33 -- # return 0
00:04:20.269 10:35:36 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:20.269 10:35:36 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:20.269 10:35:36 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:20.269 10:35:36 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:20.269 10:35:36 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:20.269 node0=512 expecting 512
00:04:20.269 10:35:36 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:20.269 10:35:36 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:20.269 10:35:36 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:20.269 10:35:36 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:04:20.269 node1=512 expecting 512
00:04:20.269 10:35:36 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:04:20.269
00:04:20.269 real	0m3.345s
00:04:20.269 user	0m1.257s
00:04:20.269 sys	0m2.147s
00:04:20.269 10:35:36 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:20.269 10:35:36 -- common/autotest_common.sh@10 -- # set +x
00:04:20.269 ************************************
00:04:20.269 END TEST even_2G_alloc
00:04:20.269 ************************************
00:04:20.269 10:35:36 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc
00:04:20.269 10:35:36 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:20.269 10:35:36 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:20.269 10:35:36 -- common/autotest_common.sh@10 -- # set +x
00:04:20.269 ************************************
00:04:20.269 START TEST odd_alloc
00:04:20.269 ************************************
00:04:20.269 10:35:36 -- common/autotest_common.sh@1104 -- # odd_alloc
00:04:20.269 10:35:36 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176
00:04:20.269 10:35:36 -- setup/hugepages.sh@49 -- # local size=2098176
00:04:20.269 10:35:36 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:20.269 10:35:36 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:20.269 10:35:36 -- setup/hugepages.sh@57 -- # nr_hugepages=1025
00:04:20.269 10:35:36 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:20.269 10:35:36 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:20.269 10:35:36 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:20.269 10:35:36 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025
00:04:20.269 10:35:36 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:20.269 10:35:36 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:20.269 10:35:36 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:20.269 10:35:36 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:20.269 10:35:36 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
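get_test_nr_hugepages 2098176 asks for 2049 MiB of hugepages; at 2048 kB per page that is 1024.5 pages, recorded as the odd count nr_hugepages=1025 (round-up assumed). The @81-@84 loop that follows deals those pages out from the last node backwards, leaving node1 with 512 pages and node0 with 513. A sketch of that split, with the update expressions inferred from the ': 513' / ': 1' no-op evaluations in the trace:

    #!/usr/bin/env bash
    # 1025 pages over two nodes ends up as node1=512, node0=513.
    size_kb=2098176 hugepagesize_kb=2048
    nr_hugepages=$(( (size_kb + hugepagesize_kb - 1) / hugepagesize_kb ))  # 1025
    _nr_hugepages=$nr_hugepages
    _no_nodes=2
    nodes_test=()
    while (( _no_nodes > 0 )); do
        nodes_test[_no_nodes - 1]=$(( _nr_hugepages / _no_nodes ))  # 512, then 513
        : $(( _nr_hugepages -= nodes_test[_no_nodes - 1] ))         # 513, then 0
        : $(( _no_nodes -= 1 ))                                     # 1, then 0
    done
    echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"            # node0=513 node1=512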
00:04:20.269 10:35:36 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:20.269 10:35:36 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:20.269 10:35:36 -- setup/hugepages.sh@83 -- # : 513
00:04:20.269 10:35:36 -- setup/hugepages.sh@84 -- # : 1
00:04:20.269 10:35:36 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:20.269 10:35:36 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513
00:04:20.269 10:35:36 -- setup/hugepages.sh@83 -- # : 0
00:04:20.269 10:35:36 -- setup/hugepages.sh@84 -- # : 0
00:04:20.269 10:35:36 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:20.269 10:35:36 -- setup/hugepages.sh@160 -- # HUGEMEM=2049
00:04:20.269 10:35:36 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes
00:04:20.269 10:35:36 -- setup/hugepages.sh@160 -- # setup output
00:04:20.269 10:35:36 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:20.269 10:35:36 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:23.563 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:23.563 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:23.563 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:23.563 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:23.563 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:23.563 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:23.563 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:23.563 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:23.563 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:23.563 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:23.563 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:23.563 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:23.563 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:23.563 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:23.563 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:23.563 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:23.563 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:23.563 10:35:39 -- setup/hugepages.sh@161 -- # verify_nr_hugepages
00:04:23.563 10:35:39 -- setup/hugepages.sh@89 -- # local node
00:04:23.563 10:35:39 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:23.563 10:35:39 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:23.563 10:35:39 -- setup/hugepages.sh@92 -- # local surp
00:04:23.563 10:35:39 -- setup/hugepages.sh@93 -- # local resv
00:04:23.563 10:35:39 -- setup/hugepages.sh@94 -- # local anon
00:04:23.563 10:35:39 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
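The @96 test just above matches the kernel's THP switch (here 'always [madvise] never') against '*[never]*', so AnonHugePages is only measured when transparent hugepages are not globally disabled. A minimal standalone version of that gate; the sysfs path is the standard kernel location, its exact sourcing by the script is assumed:

    #!/usr/bin/env bash
    # Skip the AnonHugePages measurement when THP is switched off globally.
    thp=$(< /sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
    if [[ $thp != *"[never]"* ]]; then
        anon=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)
    else
        anon=0
    fi
    echo "anon_hugepages=$anon"   # 0 kB in this run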
00:04:23.563 10:35:39 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:23.563 10:35:39 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:23.563 10:35:39 -- setup/common.sh@18 -- # local node=
00:04:23.563 10:35:39 -- setup/common.sh@19 -- # local var val
00:04:23.563 10:35:39 -- setup/common.sh@20 -- # local mem_f mem
00:04:23.563 10:35:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:23.563 10:35:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:23.563 10:35:39 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:23.563 10:35:39 -- setup/common.sh@28 -- # mapfile -t mem
00:04:23.563 10:35:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:23.563 10:35:39 -- setup/common.sh@31 -- # IFS=': '
00:04:23.563 10:35:39 -- setup/common.sh@31 -- # read -r var val _
00:04:23.563 10:35:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42682584 kB' 'MemAvailable: 45064680 kB' 'Buffers: 12536 kB' 'Cached: 11508856 kB' 'SwapCached: 16 kB' 'Active: 9756000 kB' 'Inactive: 2354388 kB' 'Active(anon): 9280648 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592188 kB' 'Mapped: 174856 kB' 'Shmem: 8748740 kB' 'KReclaimable: 245920 kB' 'Slab: 772856 kB' 'SReclaimable: 245920 kB' 'SUnreclaim: 526936 kB' 'KernelStack: 21824 kB' 'PageTables: 7960 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 10674788 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213476 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[setup/common.sh@31-32: read/compare/continue trace repeats for every /proc/meminfo field, MemTotal through HardwareCorrupted]
00:04:23.564 10:35:39 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:23.564 10:35:39 -- setup/common.sh@33 -- # echo 0
00:04:23.564 10:35:39 -- setup/common.sh@33 -- # return 0
00:04:23.564 10:35:39 -- setup/hugepages.sh@97 -- # anon=0
00:04:23.564 10:35:39 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:23.564 10:35:39 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:23.564 10:35:39 -- setup/common.sh@18 -- # local node=
00:04:23.564 10:35:39 -- setup/common.sh@19 -- # local var val
00:04:23.564 10:35:39 -- setup/common.sh@20 -- # local mem_f mem
00:04:23.564 10:35:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:23.564 10:35:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:23.564 10:35:39 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:23.564 10:35:39 -- setup/common.sh@28 -- # mapfile -t mem
00:04:23.564 10:35:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
kB' 'Mapped: 175232 kB' 'Shmem: 8748740 kB' 'KReclaimable: 245920 kB' 'Slab: 772856 kB' 'SReclaimable: 245920 kB' 'SUnreclaim: 526936 kB' 'KernelStack: 21824 kB' 'PageTables: 7976 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 10675868 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213460 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:04:23.564 10:35:39 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.564 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.564 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.564 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.564 10:35:39 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.564 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.564 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.564 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 
00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- 
setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 
10:35:39 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.565 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.565 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 
10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.566 10:35:39 -- setup/common.sh@33 -- # echo 0 00:04:23.566 10:35:39 -- setup/common.sh@33 -- # return 0 00:04:23.566 10:35:39 -- setup/hugepages.sh@99 -- # surp=0 00:04:23.566 10:35:39 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:23.566 10:35:39 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:23.566 10:35:39 -- setup/common.sh@18 -- # local node= 00:04:23.566 10:35:39 -- setup/common.sh@19 -- # local var val 00:04:23.566 10:35:39 -- setup/common.sh@20 -- # local mem_f mem 00:04:23.566 10:35:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:23.566 10:35:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:23.566 10:35:39 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:23.566 10:35:39 -- setup/common.sh@28 -- # mapfile -t mem 00:04:23.566 10:35:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42690392 kB' 'MemAvailable: 45072488 kB' 'Buffers: 12536 kB' 'Cached: 11508880 kB' 'SwapCached: 16 kB' 'Active: 9753636 kB' 'Inactive: 2354388 kB' 'Active(anon): 9278284 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 589788 kB' 'Mapped: 174832 kB' 'Shmem: 8748764 kB' 'KReclaimable: 245920 kB' 'Slab: 772876 kB' 'SReclaimable: 245920 kB' 'SUnreclaim: 526956 kB' 'KernelStack: 21840 kB' 'PageTables: 8052 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 10672308 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213412 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- 
setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- 
setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.566 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.566 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.567 10:35:39 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.567 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.567 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.567 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.567 10:35:39 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.567 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.567 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.567 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.567 10:35:39 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.567 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.567 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.567 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.567 10:35:39 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.567 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.567 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.567 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.567 10:35:39 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.567 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.567 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.567 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.567 10:35:39 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.567 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.567 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.567 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.567 10:35:39 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.567 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.567 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.567 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.567 10:35:39 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.567 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.567 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.567 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.567 10:35:39 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.567 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.567 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.567 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.567 10:35:39 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.567 10:35:39 -- 
setup/common.sh@32 -- # continue 00:04:23.567 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.567 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.567 10:35:39 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.567 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.567 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.567 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.567 10:35:39 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.567 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.567 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.567 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.567 10:35:39 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.567 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.567 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.567 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.567 10:35:39 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.567 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.829 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.829 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.829 10:35:39 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.829 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.829 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.829 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.829 10:35:39 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.829 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.829 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.829 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.829 10:35:39 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.829 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.829 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.829 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.829 10:35:39 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.829 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.829 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.829 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.829 10:35:39 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.829 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.829 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.829 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.829 10:35:39 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.829 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.829 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.829 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.829 10:35:39 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.829 10:35:39 -- setup/common.sh@32 -- # continue 00:04:23.829 10:35:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.829 10:35:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.829 10:35:39 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:23.829 10:35:39 -- setup/common.sh@33 -- # echo 0 00:04:23.829 10:35:39 -- setup/common.sh@33 -- # return 0 00:04:23.829 10:35:39 -- setup/hugepages.sh@100 -- # resv=0 
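The xtrace above is exercising a single helper over and over: a meminfo lookup that reads either /proc/meminfo or a per-node /sys/devices/system/node/nodeN/meminfo file, strips the "Node N " prefix that the per-node files carry, and scans field by field until the requested key matches, echoing its value. The sketch below reconstructs that lookup from the trace alone; it is a minimal bash rendering, not the actual setup/common.sh source, and details such as the exact read loop may differ.

    shopt -s extglob   # needed for the +([0-9]) pattern below

    # Reconstruction of the lookup traced above (assumed, not verbatim source).
    get_meminfo() {
        local get=$1 node=$2
        local var val line
        local mem_f
        local -a mem
        mem_f=/proc/meminfo
        # A node argument selects the per-node view when it exists; with no
        # argument the path degenerates to .../node/meminfo, which never exists,
        # exactly as the trace's @23 test shows.
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # Per-node files prefix every line with "Node N "; strip that prefix.
        mem=("${mem[@]#Node +([0-9]) }")
        # Scan field by field; print the value of the first matching field.
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            if [[ $var == "$get" ]]; then
                echo "$val"
                return 0
            fi
        done
        return 1
    }

Against the snapshots logged above, get_meminfo HugePages_Total would yield 1025 from /proc/meminfo, while get_meminfo HugePages_Total 0 would read node0's file and yield 512.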
00:04:23.829 10:35:39 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025
00:04:23.829 nr_hugepages=1025
00:04:23.829 10:35:39 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:23.829 resv_hugepages=0
00:04:23.829 10:35:39 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:23.829 surplus_hugepages=0
00:04:23.829 10:35:39 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:23.829 anon_hugepages=0
00:04:23.829 10:35:39 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv ))
00:04:23.829 10:35:39 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages ))
00:04:23.829 10:35:39 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:23.829 10:35:39 -- setup/common.sh@17-31 -- # [xtrace condensed: local get=HugePages_Total; local node=; mem_f=/proc/meminfo; mapfile -t mem; mem=("${mem[@]#Node +([0-9]) }"); IFS=': '; read -r var val _]
00:04:23.829 10:35:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42689896 kB' 'MemAvailable: 45071992 kB' 'Buffers: 12536 kB' 'Cached: 11508884 kB' 'SwapCached: 16 kB' 'Active: 9756764 kB' 'Inactive: 2354388 kB' 'Active(anon): 9281412 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592872 kB' 'Mapped: 174892 kB' 'Shmem: 8748768 kB' 'KReclaimable: 245920 kB' 'Slab: 772868 kB' 'SReclaimable: 245920 kB' 'SUnreclaim: 526948 kB' 'KernelStack: 21808 kB' 'PageTables: 7932 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37486620 kB' 'Committed_AS: 10675896 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213400 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:04:23.829 10:35:39 -- setup/common.sh@31-32 -- # [xtrace condensed: every field from MemTotal through Unaccepted fails the HugePages_Total match and continues]
00:04:23.830 10:35:39 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:23.830 10:35:39 -- setup/common.sh@33 -- # echo 1025
00:04:23.830 10:35:39 -- setup/common.sh@33 -- # return 0
00:04:23.830 10:35:39 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv ))
00:04:23.830 10:35:39 -- setup/hugepages.sh@112 -- # get_nodes
00:04:23.830 10:35:39 -- setup/hugepages.sh@27 -- # local node
00:04:23.830 10:35:39 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:23.830 10:35:39 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:23.830 10:35:39 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:23.830 10:35:39 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513
00:04:23.830 10:35:39 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:23.830 10:35:39 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:23.830 10:35:39 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:23.830 10:35:39 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:23.830 10:35:39 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:23.830 10:35:39 -- setup/common.sh@17-31 -- # [xtrace condensed: local get=HugePages_Surp; local node=0; mem_f=/sys/devices/system/node/node0/meminfo; mapfile -t mem; mem=("${mem[@]#Node +([0-9]) }"); IFS=': '; read -r var val _]
00:04:23.830 10:35:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 26517516 kB' 'MemUsed: 6074568 kB' 'SwapCached: 16 kB' 'Active: 3390580 kB' 'Inactive: 180704 kB' 'Active(anon): 3173960 kB' 'Inactive(anon): 16 kB' 'Active(file): 216620 kB' 'Inactive(file): 180688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3353968 kB' 'Mapped: 107020 kB' 'AnonPages: 220420 kB' 'Shmem: 2956644 kB' 'KernelStack: 12264 kB' 'PageTables: 3896 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124976 kB' 'Slab: 365860 kB' 'SReclaimable: 124976 kB' 'SUnreclaim: 240884 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:04:23.830 10:35:39 -- setup/common.sh@31-32 -- # [xtrace condensed: every node0 field from MemTotal through HugePages_Free fails the HugePages_Surp match and continues]
00:04:23.831 10:35:40 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:23.831 10:35:40 -- setup/common.sh@33 -- # echo 0
00:04:23.831 10:35:40 -- setup/common.sh@33 -- # return 0
00:04:23.831 10:35:40 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:23.831 10:35:40 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:23.831 10:35:40 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:23.831 10:35:40 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:23.831 10:35:40 -- setup/common.sh@17-31 -- # [xtrace condensed: local get=HugePages_Surp; local node=1; mem_f=/sys/devices/system/node/node1/meminfo; mapfile -t mem; mem=("${mem[@]#Node +([0-9]) }"); IFS=': '; read -r var val _]
00:04:23.831 10:35:40 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703148 kB' 'MemFree: 16180044 kB' 'MemUsed: 11523104 kB' 'SwapCached: 0 kB' 'Active: 6360644 kB' 'Inactive: 2173684 kB' 'Active(anon): 6101912 kB' 'Inactive(anon): 57072 kB' 'Active(file): 258732 kB' 'Inactive(file): 2116612 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8167484 kB' 'Mapped: 67308 kB' 'AnonPages: 367044 kB' 'Shmem: 5792140 kB' 'KernelStack: 9560 kB' 'PageTables: 4084 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 120944 kB' 'Slab: 407000 kB' 'SReclaimable: 120944 kB' 'SUnreclaim: 286056 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0'
00:04:23.831 10:35:40 -- setup/common.sh@31-32 -- # [xtrace condensed: node1 field-by-field scan against HugePages_Surp begins (MemTotal, MemFree, ...)]
10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.831 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.831 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.831 10:35:40 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.831 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.831 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.831 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.831 10:35:40 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.831 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.831 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.831 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.831 10:35:40 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.831 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.831 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.831 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.831 10:35:40 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.831 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.831 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.831 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.831 10:35:40 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.831 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.831 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.831 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.831 10:35:40 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.831 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.831 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.831 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.831 10:35:40 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.831 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 
10:35:40 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # 
IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # continue 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # IFS=': ' 00:04:23.832 10:35:40 -- setup/common.sh@31 -- # read -r var val _ 00:04:23.832 10:35:40 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:23.832 10:35:40 -- setup/common.sh@33 -- # echo 0 00:04:23.832 10:35:40 -- setup/common.sh@33 -- # return 0 00:04:23.832 10:35:40 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:23.832 10:35:40 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:23.832 10:35:40 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:23.832 10:35:40 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:23.832 10:35:40 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:04:23.832 node0=512 expecting 513 00:04:23.832 10:35:40 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:23.832 10:35:40 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:23.832 10:35:40 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 
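
The block above is setup/common.sh's get_meminfo helper scanning node 1's meminfo one key at a time: every non-matching key takes the continue branch, and the first match echoes its value (0 for HugePages_Surp here) and returns, which is the 0 consumed by (( nodes_test[node] += 0 )). Reconstructed from this trace as a minimal sketch -- the names follow the trace, but the loop structure is simplified and may differ from the upstream script:

  # Sketch of get_meminfo as implied by the xtrace above (not verbatim
  # upstream source). Falls back to /proc/meminfo when no node is given,
  # which is why the trace tests /sys/devices/system/node/node$node/meminfo.
  shopt -s extglob
  get_meminfo() {
      local get=$1 node=$2
      local var val _ line
      local mem_f=/proc/meminfo
      local -a mem
      if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      mapfile -t mem < "$mem_f"
      # Per-node files prefix every line with "Node N "; strip that prefix
      # (the +([0-9]) pattern needs extglob, enabled above).
      mem=("${mem[@]#Node +([0-9]) }")
      for line in "${mem[@]}"; do
          IFS=': ' read -r var val _ <<< "$line"
          [[ $var == "$get" ]] || continue   # the continue spam in the trace
          echo "${val:-0}"
          return 0
      done
  }

Called as get_meminfo HugePages_Surp 1, this echoes node 1's surplus-page count, matching the echo 0 / return 0 seen above.
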
00:04:23.832 10:35:40 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:04:23.832 node1=513 expecting 512 00:04:23.832 10:35:40 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:04:23.832 00:04:23.832 real 0m3.421s 00:04:23.832 user 0m1.295s 00:04:23.832 sys 0m2.174s 00:04:23.832 10:35:40 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:23.832 10:35:40 -- common/autotest_common.sh@10 -- # set +x 00:04:23.832 ************************************ 00:04:23.832 END TEST odd_alloc 00:04:23.832 ************************************ 00:04:23.832 10:35:40 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:04:23.832 10:35:40 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:23.832 10:35:40 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:23.832 10:35:40 -- common/autotest_common.sh@10 -- # set +x 00:04:23.832 ************************************ 00:04:23.832 START TEST custom_alloc 00:04:23.832 ************************************ 00:04:23.832 10:35:40 -- common/autotest_common.sh@1104 -- # custom_alloc 00:04:23.832 10:35:40 -- setup/hugepages.sh@167 -- # local IFS=, 00:04:23.832 10:35:40 -- setup/hugepages.sh@169 -- # local node 00:04:23.832 10:35:40 -- setup/hugepages.sh@170 -- # nodes_hp=() 00:04:23.832 10:35:40 -- setup/hugepages.sh@170 -- # local nodes_hp 00:04:23.832 10:35:40 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:04:23.832 10:35:40 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:04:23.832 10:35:40 -- setup/hugepages.sh@49 -- # local size=1048576 00:04:23.832 10:35:40 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:23.832 10:35:40 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:23.832 10:35:40 -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:23.832 10:35:40 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:23.832 10:35:40 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:23.832 10:35:40 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:23.832 10:35:40 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:23.832 10:35:40 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:23.832 10:35:40 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:23.832 10:35:40 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:23.832 10:35:40 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:23.832 10:35:40 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:23.832 10:35:40 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:23.832 10:35:40 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:23.832 10:35:40 -- setup/hugepages.sh@83 -- # : 256 00:04:23.832 10:35:40 -- setup/hugepages.sh@84 -- # : 1 00:04:23.832 10:35:40 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:23.832 10:35:40 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:23.832 10:35:40 -- setup/hugepages.sh@83 -- # : 0 00:04:23.832 10:35:40 -- setup/hugepages.sh@84 -- # : 0 00:04:23.832 10:35:40 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:23.832 10:35:40 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:04:23.832 10:35:40 -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:04:23.832 10:35:40 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:04:23.832 10:35:40 -- setup/hugepages.sh@49 -- # local size=2097152 00:04:23.832 10:35:40 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:23.832 10:35:40 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:23.832 10:35:40 -- setup/hugepages.sh@57 -- # 
nr_hugepages=1024 00:04:23.833 10:35:40 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:23.833 10:35:40 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:23.833 10:35:40 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:23.833 10:35:40 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:23.833 10:35:40 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:23.833 10:35:40 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:23.833 10:35:40 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:23.833 10:35:40 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:23.833 10:35:40 -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:04:23.833 10:35:40 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:23.833 10:35:40 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:23.833 10:35:40 -- setup/hugepages.sh@78 -- # return 0 00:04:23.833 10:35:40 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:04:23.833 10:35:40 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:23.833 10:35:40 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:23.833 10:35:40 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:23.833 10:35:40 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:23.833 10:35:40 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:23.833 10:35:40 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:23.833 10:35:40 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:04:23.833 10:35:40 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:23.833 10:35:40 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:23.833 10:35:40 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:23.833 10:35:40 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:23.833 10:35:40 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:23.833 10:35:40 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:23.833 10:35:40 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:23.833 10:35:40 -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:04:23.833 10:35:40 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:23.833 10:35:40 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:23.833 10:35:40 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:23.833 10:35:40 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:04:23.833 10:35:40 -- setup/hugepages.sh@78 -- # return 0 00:04:23.833 10:35:40 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:04:23.833 10:35:40 -- setup/hugepages.sh@187 -- # setup output 00:04:23.833 10:35:40 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:23.833 10:35:40 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:27.155 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:27.155 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:27.155 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:27.156 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:27.156 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:27.156 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:27.156 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:27.156 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:27.156 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:27.156 0000:80:04.6 
(8086 2021): Already using the vfio-pci driver 00:04:27.156 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:27.156 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:27.156 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:27.156 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:27.156 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:27.156 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:27.156 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:27.156 10:35:43 -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:04:27.156 10:35:43 -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:04:27.156 10:35:43 -- setup/hugepages.sh@89 -- # local node 00:04:27.156 10:35:43 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:27.156 10:35:43 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:27.156 10:35:43 -- setup/hugepages.sh@92 -- # local surp 00:04:27.156 10:35:43 -- setup/hugepages.sh@93 -- # local resv 00:04:27.156 10:35:43 -- setup/hugepages.sh@94 -- # local anon 00:04:27.156 10:35:43 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:27.156 10:35:43 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:27.156 10:35:43 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:27.156 10:35:43 -- setup/common.sh@18 -- # local node= 00:04:27.156 10:35:43 -- setup/common.sh@19 -- # local var val 00:04:27.156 10:35:43 -- setup/common.sh@20 -- # local mem_f mem 00:04:27.156 10:35:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:27.156 10:35:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:27.156 10:35:43 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:27.156 10:35:43 -- setup/common.sh@28 -- # mapfile -t mem 00:04:27.156 10:35:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 41683232 kB' 'MemAvailable: 44065328 kB' 'Buffers: 12536 kB' 'Cached: 11508992 kB' 'SwapCached: 16 kB' 'Active: 9750056 kB' 'Inactive: 2354388 kB' 'Active(anon): 9274704 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 586192 kB' 'Mapped: 174056 kB' 'Shmem: 8748876 kB' 'KReclaimable: 245920 kB' 'Slab: 772296 kB' 'SReclaimable: 245920 kB' 'SUnreclaim: 526376 kB' 'KernelStack: 21856 kB' 'PageTables: 8044 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 10670516 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213428 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- 
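
Before the device listing above, the custom_alloc prologue sized two hugepage pools with the default 2048 kB page: get_test_nr_hugepages turned 1048576 kB into 512 pages and 2097152 kB into 1024 pages, get_test_nr_hugepages_per_node spread them over the two NUMA nodes (256+256 first, then the explicit nodes_hp values), and setup.sh was invoked with HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024', giving the nr_hugepages=1536 total now being verified. A condensed, hypothetical sketch of that size-to-pages arithmetic (pages_for is an illustrative name, not the upstream helper):

  # Hypothetical condensed form of the sizing step traced above.
  default_hugepages=2048   # kB, the Hugepagesize reported in meminfo
  pages_for() {
      local size_kb=$1
      # Sizes below one hugepage are rejected, as in the traced guard
      # (( size >= default_hugepages )).
      (( size_kb >= default_hugepages )) || return 1
      echo $(( size_kb / default_hugepages ))
  }
  nodes_hp[0]=$(pages_for 1048576)   # 512
  nodes_hp[1]=$(pages_for 2097152)   # 1024
  HUGENODE="nodes_hp[0]=${nodes_hp[0]},nodes_hp[1]=${nodes_hp[1]}"
  echo $(( nodes_hp[0] + nodes_hp[1] ))   # 1536
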
setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 
10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ SUnreclaim 
== \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.156 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.156 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # 
read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:27.157 10:35:43 -- setup/common.sh@33 -- # echo 0 00:04:27.157 10:35:43 -- setup/common.sh@33 -- # return 0 00:04:27.157 10:35:43 -- setup/hugepages.sh@97 -- # anon=0 00:04:27.157 10:35:43 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:27.157 10:35:43 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:27.157 10:35:43 -- setup/common.sh@18 -- # local node= 00:04:27.157 10:35:43 -- setup/common.sh@19 -- # local var val 00:04:27.157 10:35:43 -- setup/common.sh@20 -- # local mem_f mem 00:04:27.157 10:35:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:27.157 10:35:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:27.157 10:35:43 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:27.157 10:35:43 -- setup/common.sh@28 -- # mapfile -t mem 00:04:27.157 10:35:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 41685772 kB' 'MemAvailable: 44067868 kB' 'Buffers: 12536 kB' 'Cached: 11508996 kB' 'SwapCached: 16 kB' 'Active: 9749768 kB' 'Inactive: 2354388 kB' 'Active(anon): 9274416 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 585860 kB' 'Mapped: 174332 kB' 'Shmem: 8748880 kB' 'KReclaimable: 245920 kB' 'Slab: 772336 kB' 'SReclaimable: 245920 kB' 'SUnreclaim: 526416 kB' 'KernelStack: 21824 kB' 'PageTables: 7984 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 10670528 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213412 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- 
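
The anon=0 above is gated by the check traced at the start of verify_nr_hugepages, [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]: the kernel brackets the active mode in the transparent_hugepage knob, so AnonHugePages is only consulted when THP is not pinned to never. A minimal sketch of that gate, assuming the standard sysfs path (the trace only shows the already-expanded string) and the get_meminfo sketch given earlier:

  # The active THP mode is the bracketed word, e.g. "always [madvise] never".
  thp=$(< /sys/kernel/mm/transparent_hugepage/enabled)
  if [[ $thp != *"[never]"* ]]; then
      anon=$(get_meminfo AnonHugePages)   # 0 kB in this run
  else
      anon=0   # THP disabled: anonymous hugepages cannot exist
  fi
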
setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # 
continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.157 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.157 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # [[ PageTables == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.158 
10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # continue 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # IFS=': ' 00:04:27.158 10:35:43 -- setup/common.sh@31 -- # read -r var val _ 00:04:27.158 10:35:43 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:27.158 10:35:43 -- setup/common.sh@33 -- # echo 0 00:04:27.158 10:35:43 -- setup/common.sh@33 -- # return 0 00:04:27.158 10:35:43 -- setup/hugepages.sh@99 -- # surp=0 00:04:27.158 10:35:43 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:27.158 10:35:43 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:27.158 10:35:43 -- setup/common.sh@18 -- # local node= 00:04:27.158 10:35:43 -- setup/common.sh@19 -- # local var val 00:04:27.158 10:35:43 -- setup/common.sh@20 -- # local mem_f mem 00:04:27.158 10:35:43 -- setup/common.sh@22 -- # 
mem_f=/proc/meminfo
00:04:27.158 10:35:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:27.158 10:35:43 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:27.158 10:35:43 -- setup/common.sh@28 -- # mapfile -t mem
00:04:27.158 10:35:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:27.158 10:35:43 -- setup/common.sh@31 -- # IFS=': '
00:04:27.158 10:35:43 -- setup/common.sh@31 -- # read -r var val _
00:04:27.158 10:35:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 41685548 kB' 'MemAvailable: 44067644 kB' 'Buffers: 12536 kB' 'Cached: 11509008 kB' 'SwapCached: 16 kB' 'Active: 9749496 kB' 'Inactive: 2354388 kB' 'Active(anon): 9274144 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 585556 kB' 'Mapped: 174332 kB' 'Shmem: 8748892 kB' 'KReclaimable: 245920 kB' 'Slab: 772336 kB' 'SReclaimable: 245920 kB' 'SUnreclaim: 526416 kB' 'KernelStack: 21824 kB' 'PageTables: 7984 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 10670544 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213412 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[xtrace trimmed: setup/common.sh@31-32 scan each meminfo field in turn, hitting 'continue' on every non-match, until the requested key is reached]
00:04:27.159 10:35:43 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:27.159 10:35:43 -- setup/common.sh@33 -- # echo 0
00:04:27.159 10:35:43 -- setup/common.sh@33 -- # return 0
00:04:27.159 10:35:43 -- setup/hugepages.sh@100 -- # resv=0
00:04:27.159 10:35:43 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536
00:04:27.159 nr_hugepages=1536
00:04:27.159 10:35:43 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:27.159 resv_hugepages=0
00:04:27.159 10:35:43 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:27.159 surplus_hugepages=0
00:04:27.159 10:35:43 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:27.159 anon_hugepages=0
00:04:27.160 10:35:43 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv ))
00:04:27.160 10:35:43 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages ))
00:04:27.160 10:35:43 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:27.160 10:35:43 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:27.160 10:35:43 -- setup/common.sh@18 -- # local node=
00:04:27.160 10:35:43 -- setup/common.sh@19 -- # local var val
00:04:27.160 10:35:43 -- setup/common.sh@20 -- # local mem_f mem
00:04:27.160 10:35:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:27.160 10:35:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:27.160 10:35:43 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:27.160 10:35:43 -- setup/common.sh@28 -- # mapfile -t mem
00:04:27.160 10:35:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:27.160 10:35:43 -- setup/common.sh@31 -- # IFS=': '
00:04:27.160 10:35:43 -- setup/common.sh@31 -- # read -r var val _
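The records above are setup/common.sh's get_meminfo reading /proc/meminfo with mapfile and scanning "name: value" pairs until the requested key matches. A minimal standalone sketch of the same parsing pattern (hypothetical helper name, not the SPDK script itself):

get_meminfo_field() {
    local get=$1 line var val _
    local -a mem
    # slurp /proc/meminfo into an array, one "Key:   value kB" line per element
    mapfile -t mem < /proc/meminfo
    for line in "${mem[@]}"; do
        # split "HugePages_Rsvd:       0" into key and first value field
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1
}

On the box traced here, get_meminfo_field HugePages_Rsvd would print 0, matching the 'echo 0' in the records above.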
00:04:27.160 10:35:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 41685860 kB' 'MemAvailable: 44067956 kB' 'Buffers: 12536 kB' 'Cached: 11509032 kB' 'SwapCached: 16 kB' 'Active: 9749124 kB' 'Inactive: 2354388 kB' 'Active(anon): 9273772 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 585128 kB' 'Mapped: 174332 kB' 'Shmem: 8748916 kB' 'KReclaimable: 245920 kB' 'Slab: 772336 kB' 'SReclaimable: 245920 kB' 'SUnreclaim: 526416 kB' 'KernelStack: 21792 kB' 'PageTables: 7880 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36963356 kB' 'Committed_AS: 10670556 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213412 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
[xtrace trimmed: per-field scan as above, 'continue' on every field until HugePages_Total]
00:04:27.161 10:35:43 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:27.161 10:35:43 -- setup/common.sh@33 -- # echo 1536
00:04:27.161 10:35:43 -- setup/common.sh@33 -- # return 0
00:04:27.161 10:35:43 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv ))
00:04:27.161 10:35:43 -- setup/hugepages.sh@112 -- # get_nodes
00:04:27.161 10:35:43 -- setup/hugepages.sh@27 -- # local node
00:04:27.161 10:35:43 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:27.161 10:35:43 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:27.161 10:35:43 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:27.161 10:35:43 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:27.161 10:35:43 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:27.161 10:35:43 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:27.161 10:35:43 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:27.161 10:35:43 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:27.161 10:35:43 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:27.161 10:35:43 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:27.161 10:35:43 -- setup/common.sh@18 -- # local node=0
00:04:27.161 10:35:43 -- setup/common.sh@19 -- # local var val
00:04:27.161 10:35:43 -- setup/common.sh@20 -- # local mem_f mem
00:04:27.161 10:35:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:27.161 10:35:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:27.161 10:35:43 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:27.161 10:35:43 -- setup/common.sh@28 -- # mapfile -t mem
00:04:27.161 10:35:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:27.161 10:35:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 26554608 kB' 'MemUsed: 6037476 kB' 'SwapCached: 16 kB' 'Active: 3389696 kB' 'Inactive: 180704 kB' 'Active(anon): 3173076 kB' 'Inactive(anon): 16 kB' 'Active(file): 216620 kB' 'Inactive(file): 180688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3354000 kB' 'Mapped: 107024 kB' 'AnonPages: 219484 kB' 'Shmem: 2956676 kB' 'KernelStack: 12264 kB' 'PageTables: 3900 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124976 kB' 'Slab: 365424 kB' 'SReclaimable: 124976 kB' 'SUnreclaim: 240448 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:04:27.161 10:35:43 -- setup/common.sh@31 -- # IFS=': '
00:04:27.161 10:35:43 -- setup/common.sh@31 -- # read -r var val _
[xtrace trimmed: per-field scan of the node0 counters until HugePages_Surp]
00:04:27.162 10:35:43 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:27.162 10:35:43 -- setup/common.sh@33 -- # echo 0
00:04:27.162 10:35:43 -- setup/common.sh@33 -- # return 0
00:04:27.162 10:35:43 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:27.162 10:35:43 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:27.162 10:35:43 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:27.162 10:35:43 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:27.162 10:35:43 -- setup/common.sh@17 -- # local get=HugePages_Surp
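The same scan is repeated per NUMA node; the only difference visible in the trace is that mem_f switches to /sys/devices/system/node/node<N>/meminfo and each line's "Node <N> " prefix is stripped first. A sketch of that per-node variant (hypothetical helper name; the extglob prefix-strip is the same pattern the traced script uses):

shopt -s extglob   # required for the +([0-9]) pattern below
get_node_meminfo() {
    local node=$1 get=$2 line var val _
    local -a mem
    mapfile -t mem < "/sys/devices/system/node/node${node}/meminfo"
    # per-node lines look like "Node 0 HugePages_Surp:    0"; drop the prefix
    mem=("${mem[@]#Node +([0-9]) }")
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1
}

Here get_node_meminfo 0 HugePages_Surp would return 0, which is why nodes_test[0] stays at 512 in the accounting below.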
00:04:27.162 10:35:43 -- setup/common.sh@18 -- # local node=1
00:04:27.162 10:35:43 -- setup/common.sh@19 -- # local var val
00:04:27.162 10:35:43 -- setup/common.sh@20 -- # local mem_f mem
00:04:27.162 10:35:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:27.162 10:35:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:27.162 10:35:43 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:27.162 10:35:43 -- setup/common.sh@28 -- # mapfile -t mem
00:04:27.162 10:35:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:27.162 10:35:43 -- setup/common.sh@31 -- # IFS=': '
00:04:27.162 10:35:43 -- setup/common.sh@31 -- # read -r var val _
00:04:27.162 10:35:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27703148 kB' 'MemFree: 15131376 kB' 'MemUsed: 12571772 kB' 'SwapCached: 0 kB' 'Active: 6359836 kB' 'Inactive: 2173684 kB' 'Active(anon): 6101104 kB' 'Inactive(anon): 57072 kB' 'Active(file): 258732 kB' 'Inactive(file): 2116612 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8167600 kB' 'Mapped: 67308 kB' 'AnonPages: 366072 kB' 'Shmem: 5792256 kB' 'KernelStack: 9544 kB' 'PageTables: 4032 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 120944 kB' 'Slab: 406912 kB' 'SReclaimable: 120944 kB' 'SUnreclaim: 285968 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[xtrace trimmed: per-field scan of the node1 counters until HugePages_Surp]
00:04:27.163 10:35:43 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:27.163 10:35:43 -- setup/common.sh@33 -- # echo 0
00:04:27.163 10:35:43 -- setup/common.sh@33 -- # return 0
00:04:27.163 10:35:43 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:27.163 10:35:43 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:27.163 10:35:43 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:27.163 10:35:43 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:27.163 10:35:43 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:27.163 node0=512 expecting 512
00:04:27.163 10:35:43 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:27.163 10:35:43 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:27.163 10:35:43 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:27.163 10:35:43 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024'
00:04:27.163 node1=1024 expecting 1024
00:04:27.163 10:35:43 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:04:27.163
00:04:27.163 real 0m3.209s
00:04:27.163 user 0m1.112s
00:04:27.163 sys 0m2.101s
00:04:27.163 10:35:43 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:27.163 10:35:43 -- common/autotest_common.sh@10 -- # set +x
00:04:27.163 ************************************
00:04:27.163 END TEST custom_alloc
00:04:27.163 ************************************
00:04:27.163 10:35:43 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:04:27.163 10:35:43 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:04:27.163 10:35:43 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:04:27.163 10:35:43 -- common/autotest_common.sh@10 -- # set +x
00:04:27.163 ************************************
00:04:27.163 START TEST no_shrink_alloc
00:04:27.163 ************************************
00:04:27.163 10:35:43 -- common/autotest_common.sh@1104 -- # no_shrink_alloc
00:04:27.163 10:35:43 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:04:27.163 10:35:43 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:27.163 10:35:43 -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:04:27.163 10:35:43 -- setup/hugepages.sh@51 -- # shift
00:04:27.163 10:35:43 -- setup/hugepages.sh@52 -- # node_ids=('0')
00:04:27.163 10:35:43 -- setup/hugepages.sh@52 -- # local node_ids
00:04:27.163 10:35:43 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
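custom_alloc passed because the per-node split matched its expectation ('node0=512 expecting 512', 'node1=1024 expecting 1024'). A quick way to reproduce that check by hand, assuming the same sysfs layout (illustrative one-liner, not the harness code):

expected="512,1024"
actual=$(for n in /sys/devices/system/node/node[0-9]*; do
             # the per-node meminfo line reads "Node 0 HugePages_Total:   512"
             awk '/HugePages_Total/ {print $NF}' "$n/meminfo"
         done | paste -sd, -)
[[ $actual == "$expected" ]] && echo "per-node split OK: $actual"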
-- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:27.163 10:35:43 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:27.163 10:35:43 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:27.163 10:35:43 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:27.163 10:35:43 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:27.163 10:35:43 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:27.163 10:35:43 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:27.163 10:35:43 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:27.163 10:35:43 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:04:27.163 10:35:43 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:27.163 10:35:43 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:27.163 10:35:43 -- setup/hugepages.sh@73 -- # return 0 00:04:27.163 10:35:43 -- setup/hugepages.sh@198 -- # setup output 00:04:27.163 10:35:43 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:27.163 10:35:43 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:30.454 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:30.454 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:30.454 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:30.454 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:30.454 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:30.454 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:30.454 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:30.454 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:30.454 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:30.454 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:30.454 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:30.454 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:30.454 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:30.454 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:30.454 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:30.454 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:30.454 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:30.454 10:35:46 -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:04:30.454 10:35:46 -- setup/hugepages.sh@89 -- # local node 00:04:30.454 10:35:46 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:30.454 10:35:46 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:30.454 10:35:46 -- setup/hugepages.sh@92 -- # local surp 00:04:30.454 10:35:46 -- setup/hugepages.sh@93 -- # local resv 00:04:30.454 10:35:46 -- setup/hugepages.sh@94 -- # local anon 00:04:30.454 10:35:46 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:30.454 10:35:46 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:30.454 10:35:46 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:30.454 10:35:46 -- setup/common.sh@18 -- # local node= 00:04:30.454 10:35:46 -- setup/common.sh@19 -- # local var val 00:04:30.454 10:35:46 -- setup/common.sh@20 -- # local mem_f mem 00:04:30.454 10:35:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:30.455 10:35:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:30.455 10:35:46 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:30.455 10:35:46 -- setup/common.sh@28 -- # mapfile -t mem 00:04:30.455 
10:35:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:30.455 10:35:46 -- setup/common.sh@31 -- # IFS=': '
00:04:30.455 10:35:46 -- setup/common.sh@31 -- # read -r var val _
00:04:30.455 10:35:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42727100 kB' 'MemAvailable: 45109196 kB' 'Buffers: 12536 kB' 'Cached: 11509120 kB' 'SwapCached: 16 kB' 'Active: 9757032 kB' 'Inactive: 2354388 kB' 'Active(anon): 9281680 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592556 kB' 'Mapped: 174840 kB' 'Shmem: 8749004 kB' 'KReclaimable: 245920 kB' 'Slab: 772664 kB' 'SReclaimable: 245920 kB' 'SUnreclaim: 526744 kB' 'KernelStack: 22016 kB' 'PageTables: 8508 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10681804 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213656 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:04:30.455 [xtrace condensed: setup/common.sh@31-@32 read each meminfo key in turn and continue until the AnonHugePages line matches]
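The repeated `-- # [[ <key> == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]` / `-- # continue` records are bash xtrace of get_meminfo() in spdk/test/setup/common.sh scanning that snapshot one key at a time. A minimal sketch of the function, reconstructed from the trace alone; the control flow around the optional node argument is paraphrased, not the verbatim SPDK source:

#!/usr/bin/env bash
shopt -s extglob                   # the "Node +([0-9]) " pattern below is extglob

# Sketch of get_meminfo as suggested by the xtrace; not the verbatim SPDK source.
# get_meminfo KEY [NODE] prints the value of KEY from /proc/meminfo, or from the
# per-node file when NODE is given.
get_meminfo() {
	local get=$1 node=${2:-}
	local var val _
	local mem_f mem

	mem_f=/proc/meminfo
	# node-local counters live under /sys/devices/system/node/nodeN/meminfo;
	# with an empty NODE the path below does not exist and /proc/meminfo is kept
	if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
		mem_f=/sys/devices/system/node/node$node/meminfo
	fi

	mapfile -t mem < "$mem_f"
	mem=("${mem[@]#Node +([0-9]) }")   # strip the "Node N " prefix of per-node files

	local line
	for line in "${mem[@]}"; do
		IFS=': ' read -r var val _ <<< "$line"
		[[ $var == "$get" ]] || continue   # the repeated checks seen in the trace
		echo "$val"
		return 0
	done
	return 1
}

get_meminfo HugePages_Total     # system-wide -> 1024 in this run
get_meminfo HugePages_Surp 0    # node 0 only -> 0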
00:04:30.456 10:35:46 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:30.456 10:35:46 -- setup/common.sh@33 -- # echo 0
00:04:30.456 10:35:46 -- setup/common.sh@33 -- # return 0
00:04:30.456 10:35:46 -- setup/hugepages.sh@97 -- # anon=0
00:04:30.717 10:35:46 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:30.717 10:35:46 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:30.717 10:35:46 -- setup/common.sh@18 -- # local node=
00:04:30.717 10:35:46 -- setup/common.sh@19 -- # local var val
00:04:30.717 10:35:46 -- setup/common.sh@20 -- # local mem_f mem
00:04:30.717 10:35:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:30.717 10:35:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:30.717 10:35:46 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:30.717 10:35:46 -- setup/common.sh@28 -- # mapfile -t mem
00:04:30.717 10:35:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:30.717 10:35:46 -- setup/common.sh@31 -- # IFS=': '
00:04:30.717 10:35:46 -- setup/common.sh@31 -- # read -r var val _
00:04:30.717 10:35:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42732460 kB' 'MemAvailable: 45114556 kB' 'Buffers: 12536 kB' 'Cached: 11509132 kB' 'SwapCached: 16 kB' 'Active: 9752484 kB' 'Inactive: 2354388 kB' 'Active(anon): 9277132 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 587956 kB' 'Mapped: 174960 kB' 'Shmem: 8749016 kB' 'KReclaimable: 245920 kB' 'Slab: 772672 kB' 'SReclaimable: 245920 kB' 'SUnreclaim: 526752 kB' 'KernelStack: 21968 kB' 'PageTables: 8652 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10677712 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213572 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:04:30.718 [xtrace condensed: per-key scan continues until the HugePages_Surp line matches]
00:04:30.719 10:35:46 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:30.719 10:35:46 -- setup/common.sh@33 -- # echo 0
00:04:30.719 10:35:46 -- setup/common.sh@33 -- # return 0
00:04:30.719 10:35:46 -- setup/hugepages.sh@99 -- # surp=0
00:04:30.719 10:35:46 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:30.719 10:35:46 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:30.719 10:35:46 -- setup/common.sh@18 -- # local node=
00:04:30.719 10:35:46 -- setup/common.sh@19 -- # local var val
00:04:30.719 10:35:46 -- setup/common.sh@20 -- # local mem_f mem
00:04:30.719 10:35:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:30.719 10:35:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:30.719 10:35:46 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:30.719 10:35:46 -- setup/common.sh@28 -- # mapfile -t mem
00:04:30.719 10:35:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:30.719 10:35:46 -- setup/common.sh@31 -- # IFS=': '
00:04:30.719 10:35:46 -- setup/common.sh@31 -- # read -r var val _
00:04:30.719 10:35:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42726020 kB' 'MemAvailable: 45108116 kB' 'Buffers: 12536 kB' 'Cached: 11509136 kB' 'SwapCached: 16 kB' 'Active: 9756312 kB' 'Inactive: 2354388 kB' 'Active(anon): 9280960 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591808 kB' 'Mapped: 174840 kB' 'Shmem: 8749020 kB' 'KReclaimable: 245920 kB' 'Slab: 772624 kB' 'SReclaimable: 245920 kB' 'SUnreclaim: 526704 kB' 'KernelStack: 21968 kB' 'PageTables: 8104 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10681832 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213512 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:04:30.719 [xtrace condensed: per-key scan continues until the HugePages_Rsvd line matches]
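Each value echoed by get_meminfo is captured by verify_nr_hugepages in setup/hugepages.sh (@97, @99, @100 and @110 in the trace) via command substitution. A short sketch of that step under the same variable names; only the assignments and the arithmetic tests are visible in the log, the surrounding function body is an assumption:

# Sketch of the capture-and-check step in verify_nr_hugepages (setup/hugepages.sh);
# reuses the get_meminfo sketch above.
nr_hugepages=1024   # set earlier by get_test_nr_hugepages for this test

anon=$(get_meminfo AnonHugePages)     # -> 0 in this run (hugepages.sh@97)
surp=$(get_meminfo HugePages_Surp)    # -> 0               (hugepages.sh@99)
resv=$(get_meminfo HugePages_Rsvd)    # -> 0               (hugepages.sh@100)

echo "nr_hugepages=$nr_hugepages"
echo "resv_hugepages=$resv"
echo "surplus_hugepages=$surp"
echo "anon_hugepages=$anon"

# the pool is healthy when the kernel reports exactly what the test asked for
(( $(get_meminfo HugePages_Total) == nr_hugepages + surp + resv ))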
00:04:30.720 10:35:46 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:30.720 10:35:46 -- setup/common.sh@33 -- # echo 0
00:04:30.720 10:35:46 -- setup/common.sh@33 -- # return 0
00:04:30.720 10:35:46 -- setup/hugepages.sh@100 -- # resv=0
00:04:30.720 10:35:46 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:30.720 nr_hugepages=1024
00:04:30.720 10:35:46 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:30.720 resv_hugepages=0
00:04:30.720 10:35:46 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:30.720 surplus_hugepages=0
00:04:30.720 10:35:46 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:30.720 anon_hugepages=0
00:04:30.720 10:35:46 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:30.720 10:35:46 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:04:30.720 10:35:46 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:30.720 10:35:46 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:30.720 10:35:46 -- setup/common.sh@18 -- # local node=
00:04:30.720 10:35:46 -- setup/common.sh@19 -- # local var val
00:04:30.720 10:35:46 -- setup/common.sh@20 -- # local mem_f mem
00:04:30.720 10:35:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:30.720 10:35:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:30.720 10:35:46 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:30.720 10:35:46 -- setup/common.sh@28 -- # mapfile -t mem
00:04:30.720 10:35:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:30.720 10:35:46 -- setup/common.sh@31 -- # IFS=': '
00:04:30.720 10:35:46 -- setup/common.sh@31 -- # read -r var val _
00:04:30.720 10:35:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42728508 kB' 'MemAvailable: 45110604 kB' 'Buffers: 12536 kB' 'Cached: 11509148 kB' 'SwapCached: 16 kB' 'Active: 9751604 kB' 'Inactive: 2354388 kB' 'Active(anon): 9276252 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 587576 kB' 'Mapped: 174336 kB' 'Shmem: 8749032 kB' 'KReclaimable: 245920 kB' 'Slab: 773040 kB' 'SReclaimable: 245920 kB' 'SUnreclaim: 527120 kB' 'KernelStack: 21872 kB' 'PageTables: 8096 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10675724 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213604 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:04:30.720 [xtrace condensed: per-key scan continues until the HugePages_Total line matches]
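After the totals check, get_nodes (traced just below at setup/hugepages.sh@27-@33) walks /sys/devices/system/node/node*/meminfo to record the per-node hugepage counts, and @115-@117 then re-read HugePages_Surp per node. A sketch of that pass; nodes_sys values appear already resolved in the log (1024 for node0, 0 for node1), so modeling them as per-node get_meminfo calls is an assumption, and the nodes_test seed below is hypothetical:

# Sketch of the per-node pass suggested by the xtrace (setup/hugepages.sh@27-@33,
# @115-@117); reuses the get_meminfo sketch above.
shopt -s extglob
declare -a nodes_sys nodes_test
nodes_test[0]=1024   # hypothetical seed; filled earlier by get_test_nr_hugepages_per_node
resv=0               # from the HugePages_Rsvd read above

get_nodes() {
	local node
	for node in /sys/devices/system/node/node+([0-9]); do
		nodes_sys[${node##*node}]=$(get_meminfo HugePages_Total "${node##*node}")
	done
	no_nodes=${#nodes_sys[@]}
	(( no_nodes > 0 ))   # the harness expects at least one NUMA node
}

get_nodes
for node in "${!nodes_test[@]}"; do
	(( nodes_test[node] += resv ))                                   # @116
	(( nodes_test[node] += $(get_meminfo HugePages_Surp "$node") ))  # @117
done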
10:35:46 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.721 10:35:46 -- setup/common.sh@32 -- # continue 00:04:30.721 10:35:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:30.721 10:35:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:30.721 10:35:46 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.721 10:35:46 -- setup/common.sh@32 -- # continue 00:04:30.721 10:35:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:30.721 10:35:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:30.721 10:35:46 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.721 10:35:46 -- setup/common.sh@32 -- # continue 00:04:30.721 10:35:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:30.721 10:35:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:30.721 10:35:46 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.721 10:35:46 -- setup/common.sh@32 -- # continue 00:04:30.721 10:35:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:30.721 10:35:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:30.721 10:35:46 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.721 10:35:46 -- setup/common.sh@32 -- # continue 00:04:30.721 10:35:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:30.721 10:35:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:30.721 10:35:46 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.721 10:35:46 -- setup/common.sh@32 -- # continue 00:04:30.721 10:35:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:30.721 10:35:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:30.721 10:35:46 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.721 10:35:46 -- setup/common.sh@32 -- # continue 00:04:30.721 10:35:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:30.721 10:35:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:30.721 10:35:46 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.721 10:35:46 -- setup/common.sh@32 -- # continue 00:04:30.721 10:35:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:30.721 10:35:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:30.721 10:35:46 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:30.722 10:35:46 -- setup/common.sh@33 -- # echo 1024 00:04:30.722 10:35:46 -- setup/common.sh@33 -- # return 0 00:04:30.722 10:35:46 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:30.722 10:35:46 -- setup/hugepages.sh@112 -- # get_nodes 00:04:30.722 10:35:46 -- setup/hugepages.sh@27 -- # local node 00:04:30.722 10:35:46 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:30.722 10:35:46 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:30.722 10:35:46 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:30.722 10:35:46 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:30.722 10:35:46 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:30.722 10:35:46 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:30.722 10:35:46 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:30.722 10:35:46 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:30.722 10:35:46 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:30.722 10:35:46 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:30.722 10:35:46 
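The trace above is one full pass of the get_meminfo helper in setup/common.sh: it splits each meminfo line on ': ', skips every field that is not the one requested, and echoes the value of the first match. A minimal stand-alone sketch of that pattern (reconstructed from the trace, not copied from the repository):

    #!/usr/bin/env bash
    # Print the value of one /proc/meminfo field, e.g. `get_meminfo HugePages_Total`.
    get_meminfo() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && { echo "$val"; return 0; }   # found the field
        done < /proc/meminfo
        return 1   # field not present
    }
    get_meminfo HugePages_Total   # prints 1024 on this builder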
00:04:30.722 10:35:46 -- setup/common.sh@18 -- # local node=0
00:04:30.722 10:35:46 -- setup/common.sh@19 -- # local var val
00:04:30.722 10:35:46 -- setup/common.sh@20 -- # local mem_f mem
00:04:30.722 10:35:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:30.722 10:35:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:30.722 10:35:46 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:30.722 10:35:46 -- setup/common.sh@28 -- # mapfile -t mem
00:04:30.722 10:35:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:30.722 10:35:46 -- setup/common.sh@31 -- # IFS=': '
00:04:30.722 10:35:46 -- setup/common.sh@31 -- # read -r var val _
00:04:30.722 10:35:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 25517288 kB' 'MemUsed: 7074796 kB' 'SwapCached: 16 kB' 'Active: 3389892 kB' 'Inactive: 180704 kB' 'Active(anon): 3173272 kB' 'Inactive(anon): 16 kB' 'Active(file): 216620 kB' 'Inactive(file): 180688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3354044 kB' 'Mapped: 107028 kB' 'AnonPages: 219728 kB' 'Shmem: 2956720 kB' 'KernelStack: 12232 kB' 'PageTables: 3800 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124976 kB' 'Slab: 365960 kB' 'SReclaimable: 124976 kB' 'SUnreclaim: 240984 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:04:30.722 10:35:46 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:30.722 10:35:46 -- setup/common.sh@32 -- # continue
[... the @31/@32 compare-and-continue pair repeats for each node0 field, MemFree through HugePages_Free, none matching HugePages_Surp ...]
00:04:30.723 10:35:46 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:30.723 10:35:46 -- setup/common.sh@33 -- # echo 0
00:04:30.723 10:35:46 -- setup/common.sh@33 -- # return 0
00:04:30.723 10:35:46 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:30.723 10:35:46 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:30.723 10:35:46 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:30.723 10:35:46 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:30.723 10:35:46 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:30.723 node0=1024 expecting 1024
00:04:30.723 10:35:46 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:30.723 10:35:46 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:04:30.723 10:35:46 -- setup/hugepages.sh@202 -- # NRHUGE=512
00:04:30.723 10:35:46 -- setup/hugepages.sh@202 -- # setup output
00:04:30.723 10:35:46 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:30.723 10:35:46 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:34.016 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:34.016 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:34.016 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:34.016 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:34.016 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:34.016 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:34.016 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:34.016 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:34.016 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:34.016 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:34.016 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:34.016 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:34.016 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:34.016 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:34.016 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:34.016 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:34.016 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:34.016 INFO: Requested 512 hugepages but 1024 already allocated on node0
00:04:34.016 10:35:50 -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:04:34.016 10:35:50 -- setup/hugepages.sh@89 -- # local node
00:04:34.016 10:35:50 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:34.016 10:35:50 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:34.016 10:35:50 -- setup/hugepages.sh@92 -- # local surp
00:04:34.016 10:35:50 -- setup/hugepages.sh@93 -- # local resv
00:04:34.016 10:35:50 -- setup/hugepages.sh@94 -- # local anon
00:04:34.016 10:35:50 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:34.016 10:35:50 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:34.016 10:35:50 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:34.016 10:35:50 -- setup/common.sh@18 -- # local node=
00:04:34.016 10:35:50 -- setup/common.sh@19 -- # local var val
00:04:34.016 10:35:50 -- setup/common.sh@20 -- # local mem_f mem
00:04:34.016 10:35:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:34.016 10:35:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:34.016 10:35:50 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:34.016 10:35:50 -- setup/common.sh@28 -- # mapfile -t mem
00:04:34.016 10:35:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:34.016 10:35:50 -- setup/common.sh@31 -- # IFS=': '
00:04:34.016 10:35:50 -- setup/common.sh@31 -- # read -r var val _
00:04:34.016 10:35:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42751144 kB' 'MemAvailable: 45133224 kB' 'Buffers: 12536 kB' 'Cached: 11509240 kB' 'SwapCached: 16 kB' 'Active: 9757468 kB' 'Inactive: 2354388 kB' 'Active(anon): 9282116 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593240 kB' 'Mapped: 175332 kB' 'Shmem: 8749124 kB' 'KReclaimable: 245888 kB' 'Slab: 772004 kB' 'SReclaimable: 245888 kB' 'SUnreclaim: 526116 kB' 'KernelStack: 21856 kB' 'PageTables: 8128 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10677920 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213288 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:04:34.016 10:35:50 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:34.016 10:35:50 -- setup/common.sh@32 -- # continue
[... the @31/@32 compare-and-continue pair repeats for each field, MemFree through HardwareCorrupted, none matching AnonHugePages ...]
00:04:34.016 10:35:50 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:34.016 10:35:50 -- setup/common.sh@33 -- # echo 0
00:04:34.016 10:35:50 -- setup/common.sh@33 -- # return 0
00:04:34.016 10:35:50 -- setup/hugepages.sh@97 -- # anon=0
00:04:34.016 10:35:50 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
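The @96 test above matches the kernel's THP switch value "always [madvise] never" (the bracketed word is the active mode) against the pattern *\[never\]*, so anonymous hugepages are only queried when transparent hugepages are not disabled; here the query still returns 0. The same guard as stand-alone shell, with the sysfs path assumed from the traced value and reusing the get_meminfo sketch above:

    thp=/sys/kernel/mm/transparent_hugepage/enabled
    if [[ -r $thp && $(<"$thp") != *"[never]"* ]]; then
        anon=$(get_meminfo AnonHugePages)   # kB of THP-backed anonymous memory
    else
        anon=0                              # THP disabled; nothing to count
    fi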
00:04:34.016 10:35:50 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:34.016 10:35:50 -- setup/common.sh@18 -- # local node=
00:04:34.016 10:35:50 -- setup/common.sh@19 -- # local var val
00:04:34.016 10:35:50 -- setup/common.sh@20 -- # local mem_f mem
00:04:34.016 10:35:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:34.016 10:35:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:34.016 10:35:50 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:34.016 10:35:50 -- setup/common.sh@28 -- # mapfile -t mem
00:04:34.016 10:35:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:34.016 10:35:50 -- setup/common.sh@31 -- # IFS=': '
00:04:34.016 10:35:50 -- setup/common.sh@31 -- # read -r var val _
00:04:34.017 10:35:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42751812 kB' 'MemAvailable: 45133892 kB' 'Buffers: 12536 kB' 'Cached: 11509244 kB' 'SwapCached: 16 kB' 'Active: 9751304 kB' 'Inactive: 2354388 kB' 'Active(anon): 9275952 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 587576 kB' 'Mapped: 174336 kB' 'Shmem: 8749128 kB' 'KReclaimable: 245888 kB' 'Slab: 771948 kB' 'SReclaimable: 245888 kB' 'SUnreclaim: 526060 kB' 'KernelStack: 21824 kB' 'PageTables: 7980 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10671812 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213268 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
00:04:34.017 10:35:50 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:34.017 10:35:50 -- setup/common.sh@32 -- # continue
[... the @31/@32 compare-and-continue pair repeats for each field, MemFree through HugePages_Rsvd, none matching HugePages_Surp ...]
00:04:34.017 10:35:50 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:34.017 10:35:50 -- setup/common.sh@33 -- # echo 0
00:04:34.017 10:35:50 -- setup/common.sh@33 -- # return 0
00:04:34.017 10:35:50 -- setup/hugepages.sh@99 -- # surp=0
00:04:34.017 10:35:50 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:34.017 10:35:50 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:34.017 10:35:50 -- setup/common.sh@18 -- # local node=
00:04:34.017 10:35:50 -- setup/common.sh@19 -- # local var val
00:04:34.017 10:35:50 -- setup/common.sh@20 -- # local mem_f mem
00:04:34.017 10:35:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:34.017 10:35:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:34.017 10:35:50 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:34.017 10:35:50 -- setup/common.sh@28 -- # mapfile -t mem
00:04:34.017 10:35:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:34.017 10:35:50 -- setup/common.sh@31 -- # IFS=': '
00:04:34.017 10:35:50 -- setup/common.sh@31 -- # read -r var val _
00:04:34.017 10:35:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42751812 kB' 'MemAvailable: 45133892 kB' 'Buffers: 12536 kB' 'Cached: 11509256 kB' 'SwapCached: 16 kB' 'Active: 9751356 kB' 'Inactive: 2354388 kB' 'Active(anon): 9276004 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 587672 kB' 'Mapped: 174336 kB' 'Shmem: 8749140 kB' 'KReclaimable: 245888 kB' 'Slab: 771948 kB' 'SReclaimable: 245888 kB' 'SUnreclaim: 526060 kB' 'KernelStack: 21840 kB' 'PageTables: 8032 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10671828 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213268 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB'
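The helper now repeats the identical field-by-field scan for HugePages_Rsvd over the snapshot just printed. For a one-off query outside this script, the same lookup is commonly a single awk filter (an equivalent shown for illustration, not what setup/common.sh uses):

    awk '$1 == "HugePages_Rsvd:" {print $2}' /proc/meminfo   # 0 on this builder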
'Active: 9751356 kB' 'Inactive: 2354388 kB' 'Active(anon): 9276004 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 587672 kB' 'Mapped: 174336 kB' 'Shmem: 8749140 kB' 'KReclaimable: 245888 kB' 'Slab: 771948 kB' 'SReclaimable: 245888 kB' 'SUnreclaim: 526060 kB' 'KernelStack: 21840 kB' 'PageTables: 8032 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10671828 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 213268 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.017 10:35:50 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.017 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.017 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.018 10:35:50 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.018 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.018 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.018 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.018 10:35:50 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d 
]] 00:04:34.018 10:35:50 -- setup/common.sh@32 -- # continue [... the @31 read / @32 compare-and-continue xtrace repeats for every remaining /proc/meminfo field (Shmem through CmaFree), none matching HugePages_Rsvd ...]
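The field-by-field scan condensed above is setup/common.sh's get_meminfo walking /proc/meminfo until the requested key matches. A minimal sketch reconstructed from the xtrace (the real helper uses mapfile plus extglob prefix-stripping, as the @28/@29 lines elsewhere in this log show; the sed here is an approximation of that stripping):

get_meminfo() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo var val _
    # per-node queries read that node's own meminfo instead
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    # node meminfo lines carry a "Node N " prefix; strip it before splitting
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"    # e.g. the bare 0 or 1024 seen at the @33 echo lines
            return 0
        fi
    done < <(sed 's/^Node [0-9]* //' "$mem_f")
    return 1
}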
00:04:34.018 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.018 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.018 10:35:50 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.018 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.018 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.018 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.018 10:35:50 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.018 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.018 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.018 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.018 10:35:50 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.018 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.018 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.018 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.018 10:35:50 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.018 10:35:50 -- setup/common.sh@33 -- # echo 0 00:04:34.018 10:35:50 -- setup/common.sh@33 -- # return 0 00:04:34.018 10:35:50 -- setup/hugepages.sh@100 -- # resv=0 00:04:34.018 10:35:50 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:34.018 nr_hugepages=1024 00:04:34.018 10:35:50 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:34.018 resv_hugepages=0 00:04:34.018 10:35:50 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:34.018 surplus_hugepages=0 00:04:34.018 10:35:50 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:34.018 anon_hugepages=0 00:04:34.018 10:35:50 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:34.018 10:35:50 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:34.018 10:35:50 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:34.018 10:35:50 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:34.018 10:35:50 -- setup/common.sh@18 -- # local node= 00:04:34.018 10:35:50 -- setup/common.sh@19 -- # local var val 00:04:34.018 10:35:50 -- setup/common.sh@20 -- # local mem_f mem 00:04:34.018 10:35:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:34.018 10:35:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:34.018 10:35:50 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:34.018 10:35:50 -- setup/common.sh@28 -- # mapfile -t mem 00:04:34.018 10:35:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:34.018 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.018 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.018 10:35:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60295232 kB' 'MemFree: 42751560 kB' 'MemAvailable: 45133640 kB' 'Buffers: 12536 kB' 'Cached: 11509268 kB' 'SwapCached: 16 kB' 'Active: 9751368 kB' 'Inactive: 2354388 kB' 'Active(anon): 9276016 kB' 'Inactive(anon): 57088 kB' 'Active(file): 475352 kB' 'Inactive(file): 2297300 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8387836 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 587672 kB' 'Mapped: 174336 kB' 'Shmem: 8749152 kB' 'KReclaimable: 245888 kB' 'Slab: 771948 kB' 'SReclaimable: 245888 kB' 'SUnreclaim: 526060 kB' 'KernelStack: 21840 kB' 'PageTables: 8032 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37487644 kB' 'Committed_AS: 10671844 kB' 'VmallocTotal: 
34359738367 kB' 'VmallocUsed: 213268 kB' 'VmallocChunk: 0 kB' 'Percpu: 73472 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 484724 kB' 'DirectMap2M: 8638464 kB' 'DirectMap1G: 59768832 kB' [... the @31 read / @32 compare-and-continue xtrace repeats for every /proc/meminfo field (MemTotal through Unaccepted), none matching HugePages_Total ...] 00:04:34.277 10:35:50 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.277 10:35:50 --
setup/common.sh@33 -- # echo 1024 00:04:34.277 10:35:50 -- setup/common.sh@33 -- # return 0 00:04:34.278 10:35:50 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:34.278 10:35:50 -- setup/hugepages.sh@112 -- # get_nodes 00:04:34.278 10:35:50 -- setup/hugepages.sh@27 -- # local node 00:04:34.278 10:35:50 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:34.278 10:35:50 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:34.278 10:35:50 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:34.278 10:35:50 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:34.278 10:35:50 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:34.278 10:35:50 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:34.278 10:35:50 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:34.278 10:35:50 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:34.278 10:35:50 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:34.278 10:35:50 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:34.278 10:35:50 -- setup/common.sh@18 -- # local node=0 00:04:34.278 10:35:50 -- setup/common.sh@19 -- # local var val 00:04:34.278 10:35:50 -- setup/common.sh@20 -- # local mem_f mem 00:04:34.278 10:35:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:34.278 10:35:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:34.278 10:35:50 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:34.278 10:35:50 -- setup/common.sh@28 -- # mapfile -t mem 00:04:34.278 10:35:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:34.278 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.278 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.278 10:35:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32592084 kB' 'MemFree: 25517840 kB' 'MemUsed: 7074244 kB' 'SwapCached: 16 kB' 'Active: 3389924 kB' 'Inactive: 180704 kB' 'Active(anon): 3173304 kB' 'Inactive(anon): 16 kB' 'Active(file): 216620 kB' 'Inactive(file): 180688 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3354096 kB' 'Mapped: 107028 kB' 'AnonPages: 219904 kB' 'Shmem: 2956772 kB' 'KernelStack: 12280 kB' 'PageTables: 3952 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124976 kB' 'Slab: 365204 kB' 'SReclaimable: 124976 kB' 'SUnreclaim: 240228 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:34.278 10:35:50 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.278 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.278 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.278 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.278 10:35:50 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.278 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.278 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.278 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.278 10:35:50 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.278 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.278 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.278 10:35:50 -- setup/common.sh@31 -- # 
read -r var val _ [... the @31 read / @32 compare-and-continue xtrace repeats for each node0 meminfo field (SwapCached through SUnreclaim), none matching HugePages_Surp ...] 00:04:34.278 10:35:50 -- setup/common.sh@32 -- # [[ AnonHugePages ==
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.278 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.278 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.278 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.278 10:35:50 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.278 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.278 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.278 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.278 10:35:50 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.278 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.278 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.278 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.278 10:35:50 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.278 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.278 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.278 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.278 10:35:50 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.279 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.279 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.279 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.279 10:35:50 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.279 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.279 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.279 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.279 10:35:50 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.279 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.279 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.279 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.279 10:35:50 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.279 10:35:50 -- setup/common.sh@32 -- # continue 00:04:34.279 10:35:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:34.279 10:35:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:34.279 10:35:50 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.279 10:35:50 -- setup/common.sh@33 -- # echo 0 00:04:34.279 10:35:50 -- setup/common.sh@33 -- # return 0 00:04:34.279 10:35:50 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:34.279 10:35:50 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:34.279 10:35:50 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:34.279 10:35:50 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:34.279 10:35:50 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:34.279 node0=1024 expecting 1024 00:04:34.279 10:35:50 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:34.279 00:04:34.279 real 0m7.115s 00:04:34.279 user 0m2.592s 00:04:34.279 sys 0m4.642s 00:04:34.279 10:35:50 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:34.279 10:35:50 -- common/autotest_common.sh@10 -- # set +x 00:04:34.279 ************************************ 00:04:34.279 END TEST no_shrink_alloc 00:04:34.279 ************************************ 00:04:34.279 10:35:50 -- setup/hugepages.sh@217 -- # clear_hp 00:04:34.279 10:35:50 -- setup/hugepages.sh@37 -- # local node hp 00:04:34.279 10:35:50 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:34.279 
10:35:50 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:34.279 10:35:50 -- setup/hugepages.sh@41 -- # echo 0 00:04:34.279 10:35:50 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:34.279 10:35:50 -- setup/hugepages.sh@41 -- # echo 0 00:04:34.279 10:35:50 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:34.279 10:35:50 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:34.279 10:35:50 -- setup/hugepages.sh@41 -- # echo 0 00:04:34.279 10:35:50 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:34.279 10:35:50 -- setup/hugepages.sh@41 -- # echo 0 00:04:34.279 10:35:50 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:34.279 10:35:50 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:34.279 00:04:34.279 real 0m25.901s 00:04:34.279 user 0m8.947s 00:04:34.279 sys 0m15.753s 00:04:34.279 10:35:50 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:34.279 10:35:50 -- common/autotest_common.sh@10 -- # set +x 00:04:34.279 ************************************ 00:04:34.279 END TEST hugepages 00:04:34.279 ************************************ 00:04:34.279 10:35:50 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:04:34.279 10:35:50 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:34.279 10:35:50 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:34.279 10:35:50 -- common/autotest_common.sh@10 -- # set +x 00:04:34.279 ************************************ 00:04:34.279 START TEST driver 00:04:34.279 ************************************ 00:04:34.279 10:35:50 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:04:34.279 * Looking for test storage... 
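Before the driver suite starts, the clear_hp teardown traced just above walks every NUMA node and zeroes each hugepage pool. A rough reconstruction; the xtrace shows only a bare echo 0 at @41, so writing it into each pool's nr_hugepages file is an assumption (and needs root):

clear_hp() {
    local node hp
    for node in /sys/devices/system/node/node[0-9]*; do
        for hp in "$node"/hugepages/hugepages-*; do
            echo 0 > "$hp/nr_hugepages"   # release all pages of this size
        done
    done
    export CLEAR_HUGE=yes
}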
00:04:34.279 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:34.279 10:35:50 -- setup/driver.sh@68 -- # setup reset 00:04:34.279 10:35:50 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:34.279 10:35:50 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:39.553 10:35:55 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:04:39.553 10:35:55 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:39.553 10:35:55 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:39.553 10:35:55 -- common/autotest_common.sh@10 -- # set +x 00:04:39.553 ************************************ 00:04:39.553 START TEST guess_driver 00:04:39.553 ************************************ 00:04:39.553 10:35:55 -- common/autotest_common.sh@1104 -- # guess_driver 00:04:39.553 10:35:55 -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:39.553 10:35:55 -- setup/driver.sh@47 -- # local fail=0 00:04:39.553 10:35:55 -- setup/driver.sh@49 -- # pick_driver 00:04:39.553 10:35:55 -- setup/driver.sh@36 -- # vfio 00:04:39.553 10:35:55 -- setup/driver.sh@21 -- # local iommu_groups 00:04:39.553 10:35:55 -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:39.553 10:35:55 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:39.553 10:35:55 -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:39.553 10:35:55 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:39.553 10:35:55 -- setup/driver.sh@29 -- # (( 176 > 0 )) 00:04:39.553 10:35:55 -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:39.553 10:35:55 -- setup/driver.sh@14 -- # mod vfio_pci 00:04:39.553 10:35:55 -- setup/driver.sh@12 -- # dep vfio_pci 00:04:39.553 10:35:55 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:39.553 10:35:55 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:39.553 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:39.553 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:39.553 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:39.553 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:39.553 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:39.553 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:39.553 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:39.553 10:35:55 -- setup/driver.sh@30 -- # return 0 00:04:39.553 10:35:55 -- setup/driver.sh@37 -- # echo vfio-pci 00:04:39.553 10:35:55 -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:39.553 10:35:55 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:39.553 10:35:55 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:39.553 Looking for driver=vfio-pci 00:04:39.553 10:35:55 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:39.553 10:35:55 -- setup/driver.sh@45 -- # setup output config 00:04:39.553 10:35:55 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:39.553 10:35:55 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
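The run of @58/@61 checks that follows verifies, for every line that 'setup output config' prints, that the binding marker is '->' and the bound driver is the vfio-pci just picked. Roughly (driver and fail are the locals declared at @46/@47 above; setup is the harness wrapper around scripts/setup.sh):

fail=0
while read -r _ _ _ _ marker setup_driver; do
    # only config lines carrying a binding arrow are checked
    if [[ $marker == '->' && $setup_driver != "$driver" ]]; then
        fail=$((fail + 1))   # device bound to something other than vfio-pci
    fi
done < <(setup output config)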
00:04:42.867 10:35:58 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:42.867 10:35:58 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:42.867 10:35:58 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver [... the identical @58/@61 marker-and-driver checks repeat for each remaining device reported by setup.sh config ...] 00:04:44.249 10:36:00 --
setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:44.249 10:36:00 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:44.249 10:36:00 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:44.249 10:36:00 -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:44.249 10:36:00 -- setup/driver.sh@65 -- # setup reset 00:04:44.249 10:36:00 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:44.249 10:36:00 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:49.525 00:04:49.525 real 0m9.411s 00:04:49.525 user 0m2.432s 00:04:49.525 sys 0m4.644s 00:04:49.525 10:36:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:49.525 10:36:04 -- common/autotest_common.sh@10 -- # set +x 00:04:49.525 ************************************ 00:04:49.525 END TEST guess_driver 00:04:49.525 ************************************ 00:04:49.525 00:04:49.525 real 0m14.405s 00:04:49.525 user 0m3.839s 00:04:49.525 sys 0m7.459s 00:04:49.525 10:36:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:49.525 10:36:04 -- common/autotest_common.sh@10 -- # set +x 00:04:49.525 ************************************ 00:04:49.525 END TEST driver 00:04:49.525 ************************************ 00:04:49.525 10:36:04 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:49.525 10:36:04 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:49.525 10:36:04 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:49.525 10:36:04 -- common/autotest_common.sh@10 -- # set +x 00:04:49.525 ************************************ 00:04:49.525 START TEST devices 00:04:49.525 ************************************ 00:04:49.525 10:36:04 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:49.525 * Looking for test storage... 
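For reference, the guess_driver decision that just passed reduces to the probe below: prefer vfio-pci when the kernel exposes IOMMU groups (176 of them here) and modprobe can resolve the vfio_pci module to real .ko objects. A rough sketch reconstructed from the @21-@37 trace, with the fallback drivers and error paths elided:

pick_driver() {
    local unsafe_vfio=N
    if [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]]; then
        unsafe_vfio=$(< /sys/module/vfio/parameters/enable_unsafe_noiommu_mode)
    fi
    local iommu_groups=(/sys/kernel/iommu_groups/*)
    if (( ${#iommu_groups[@]} > 0 )) || [[ $unsafe_vfio == [Yy]* ]]; then
        # is_driver: the module must resolve to insmod-able .ko files
        if [[ $(modprobe --show-depends vfio_pci) == *.ko* ]]; then
            echo vfio-pci
            return 0
        fi
    fi
    echo 'No valid driver found'
}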
00:04:49.525 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:49.525 10:36:05 -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:49.525 10:36:05 -- setup/devices.sh@192 -- # setup reset 00:04:49.525 10:36:05 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:49.525 10:36:05 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:52.813 10:36:08 -- setup/devices.sh@194 -- # get_zoned_devs 00:04:52.813 10:36:08 -- common/autotest_common.sh@1654 -- # zoned_devs=() 00:04:52.813 10:36:08 -- common/autotest_common.sh@1654 -- # local -gA zoned_devs 00:04:52.813 10:36:08 -- common/autotest_common.sh@1655 -- # local nvme bdf 00:04:52.813 10:36:08 -- common/autotest_common.sh@1657 -- # for nvme in /sys/block/nvme* 00:04:52.813 10:36:08 -- common/autotest_common.sh@1658 -- # is_block_zoned nvme0n1 00:04:52.813 10:36:08 -- common/autotest_common.sh@1647 -- # local device=nvme0n1 00:04:52.813 10:36:08 -- common/autotest_common.sh@1649 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:52.813 10:36:08 -- common/autotest_common.sh@1650 -- # [[ none != none ]] 00:04:52.813 10:36:08 -- setup/devices.sh@196 -- # blocks=() 00:04:52.813 10:36:08 -- setup/devices.sh@196 -- # declare -a blocks 00:04:52.813 10:36:08 -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:52.813 10:36:08 -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:52.813 10:36:08 -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:52.813 10:36:08 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:52.813 10:36:08 -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:52.813 10:36:08 -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:52.813 10:36:08 -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:04:52.813 10:36:08 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:52.813 10:36:08 -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:52.813 10:36:08 -- scripts/common.sh@380 -- # local block=nvme0n1 pt 00:04:52.813 10:36:08 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:52.813 No valid GPT data, bailing 00:04:52.813 10:36:08 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:52.813 10:36:08 -- scripts/common.sh@393 -- # pt= 00:04:52.813 10:36:08 -- scripts/common.sh@394 -- # return 1 00:04:52.813 10:36:08 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:52.813 10:36:08 -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:52.813 10:36:08 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:52.813 10:36:08 -- setup/common.sh@80 -- # echo 1600321314816 00:04:52.813 10:36:08 -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:04:52.813 10:36:08 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:52.813 10:36:08 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:04:52.813 10:36:08 -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:52.813 10:36:08 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:52.813 10:36:08 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:52.813 10:36:08 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:04:52.813 10:36:08 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:04:52.813 10:36:08 -- common/autotest_common.sh@10 -- # set +x 00:04:52.813 ************************************ 00:04:52.813 START TEST nvme_mount 00:04:52.813 ************************************ 00:04:52.813 
10:36:08 -- common/autotest_common.sh@1104 -- # nvme_mount 00:04:52.813 10:36:08 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:52.813 10:36:08 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:52.813 10:36:08 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:52.813 10:36:08 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:52.813 10:36:08 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:52.813 10:36:08 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:52.813 10:36:08 -- setup/common.sh@40 -- # local part_no=1 00:04:52.813 10:36:08 -- setup/common.sh@41 -- # local size=1073741824 00:04:52.813 10:36:08 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:52.813 10:36:08 -- setup/common.sh@44 -- # parts=() 00:04:52.813 10:36:08 -- setup/common.sh@44 -- # local parts 00:04:52.813 10:36:08 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:52.813 10:36:08 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:52.813 10:36:08 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:52.813 10:36:08 -- setup/common.sh@46 -- # (( part++ )) 00:04:52.813 10:36:08 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:52.813 10:36:08 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:52.813 10:36:08 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:52.813 10:36:08 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:04:53.749 Creating new GPT entries in memory. 00:04:53.749 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:53.749 other utilities. 00:04:53.749 10:36:09 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:53.749 10:36:09 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:53.749 10:36:09 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:53.749 10:36:09 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:53.749 10:36:09 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:54.685 Creating new GPT entries in memory. 00:04:54.685 The operation has completed successfully. 
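The sgdisk calls just traced implement the partition arithmetic directly: size is converted from bytes to 512-byte sectors (1073741824 / 512 = 2097152), so partition 1 spans sectors 2048 through 2099199, exactly the --new argument above. A condensed reconstruction of that flow (the sync_dev_uevents wrapper seen in the trace is omitted):

partition_drive() {
    local disk=$1 part_no=${2:-1} size=${3:-1073741824}
    local part part_start=0 part_end=0
    (( size /= 512 ))              # bytes -> 512-byte sectors
    sgdisk "/dev/$disk" --zap-all  # wipe any existing GPT/MBR signatures
    for (( part = 1; part <= part_no; part++ )); do
        (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
        (( part_end = part_start + size - 1 ))   # 2048..2099199 for 1 GiB
        flock "/dev/$disk" sgdisk "/dev/$disk" --new="$part:$part_start:$part_end"
    done
}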
00:04:54.685 10:36:10 -- setup/common.sh@57 -- # (( part++ )) 00:04:54.685 10:36:10 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:54.685 10:36:10 -- setup/common.sh@62 -- # wait 1950163 00:04:54.685 10:36:10 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:54.685 10:36:10 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:54.685 10:36:10 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:54.685 10:36:10 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:04:54.685 10:36:10 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:04:54.685 10:36:10 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:54.685 10:36:10 -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:54.685 10:36:10 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:54.685 10:36:10 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:04:54.685 10:36:10 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:54.685 10:36:10 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:54.685 10:36:10 -- setup/devices.sh@53 -- # local found=0 00:04:54.685 10:36:10 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:54.685 10:36:10 -- setup/devices.sh@56 -- # : 00:04:54.685 10:36:10 -- setup/devices.sh@59 -- # local pci status 00:04:54.685 10:36:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.685 10:36:10 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:54.685 10:36:10 -- setup/devices.sh@47 -- # setup output config 00:04:54.685 10:36:10 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:54.685 10:36:10 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:57.978 10:36:13 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:57.978 10:36:13 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:04:57.978 10:36:13 -- setup/devices.sh@63 -- # found=1 00:04:57.978 10:36:13 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.978 10:36:13 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:57.978 10:36:13 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.978 10:36:13 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:57.978 10:36:13 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.978 10:36:13 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:57.978 10:36:13 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.978 10:36:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:57.978 10:36:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.978 10:36:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 
]] 00:04:57.978 10:36:14 -- setup/devices.sh@60 -- # read -r pci _ _ status [... the @62 PCI-allowlist comparison and @60 read repeat for 0000:00:04.5 through 0000:80:04.7, none matching 0000:d8:00.0 ...] 00:04:57.979 10:36:14 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:57.979 10:36:14 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:57.979 10:36:14 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:57.979 10:36:14 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:57.979 10:36:14 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:57.979 10:36:14 -- setup/devices.sh@110 -- # cleanup_nvme 00:04:57.979 10:36:14 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:57.979 10:36:14 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:57.979 10:36:14 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:57.979 10:36:14 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:57.979 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:57.979 10:36:14 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:57.979 10:36:14 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:58.239 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:58.239 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:04:58.239 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe
(PMBR): 55 aa 00:04:58.239 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:58.239 10:36:14 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:58.239 10:36:14 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:58.239 10:36:14 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:58.239 10:36:14 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:04:58.239 10:36:14 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:04:58.239 10:36:14 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:58.239 10:36:14 -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:58.239 10:36:14 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:58.239 10:36:14 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:04:58.239 10:36:14 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:58.239 10:36:14 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:58.239 10:36:14 -- setup/devices.sh@53 -- # local found=0 00:04:58.239 10:36:14 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:58.239 10:36:14 -- setup/devices.sh@56 -- # : 00:04:58.239 10:36:14 -- setup/devices.sh@59 -- # local pci status 00:04:58.239 10:36:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:58.239 10:36:14 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:58.239 10:36:14 -- setup/devices.sh@47 -- # setup output config 00:04:58.239 10:36:14 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:58.239 10:36:14 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:01.537 10:36:17 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:01.537 10:36:17 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:05:01.537 10:36:17 -- setup/devices.sh@63 -- # found=1 00:05:01.537 10:36:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:01.537 10:36:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:01.537 10:36:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:01.537 10:36:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:01.537 10:36:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:01.537 10:36:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:01.537 10:36:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:01.537 10:36:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:01.537 10:36:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:01.537 10:36:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:01.537 10:36:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:01.537 10:36:17 -- 
setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:01.537 10:36:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:01.537 10:36:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:01.537 10:36:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:01.537 10:36:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:01.537 10:36:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:01.537 10:36:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:01.537 10:36:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:01.537 10:36:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:01.537 10:36:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:01.537 10:36:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:01.537 10:36:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:01.537 10:36:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:01.537 10:36:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:01.537 10:36:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:01.537 10:36:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:01.537 10:36:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:01.537 10:36:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:01.537 10:36:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:01.537 10:36:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:01.537 10:36:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:01.537 10:36:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:01.537 10:36:17 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:01.537 10:36:17 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:01.537 10:36:17 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:01.537 10:36:17 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:01.537 10:36:17 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:01.537 10:36:17 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:01.537 10:36:17 -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:05:01.537 10:36:17 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:01.537 10:36:17 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:05:01.537 10:36:17 -- setup/devices.sh@50 -- # local mount_point= 00:05:01.537 10:36:17 -- setup/devices.sh@51 -- # local test_file= 00:05:01.537 10:36:17 -- setup/devices.sh@53 -- # local found=0 00:05:01.537 10:36:17 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:01.537 10:36:17 -- setup/devices.sh@59 -- # local pci status 00:05:01.537 10:36:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:01.537 10:36:17 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:01.537 10:36:17 -- setup/devices.sh@47 -- # setup output config 00:05:01.537 10:36:17 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:01.537 10:36:17 -- setup/common.sh@10 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:04.858 10:36:20 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.858 10:36:20 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:05:04.858 10:36:20 -- setup/devices.sh@63 -- # found=1 00:05:04.858 10:36:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.858 10:36:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.858 10:36:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.858 10:36:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.858 10:36:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.858 10:36:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.858 10:36:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.858 10:36:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.858 10:36:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.858 10:36:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.858 10:36:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.858 10:36:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.858 10:36:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.858 10:36:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.858 10:36:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.858 10:36:20 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.858 10:36:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.858 10:36:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.858 10:36:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.858 10:36:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.858 10:36:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.858 10:36:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.858 10:36:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.859 10:36:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.859 10:36:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.859 10:36:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.859 10:36:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.859 10:36:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.859 10:36:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.859 10:36:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.859 10:36:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.859 10:36:20 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:04.859 10:36:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.859 10:36:21 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:04.859 10:36:21 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:04.859 10:36:21 -- setup/devices.sh@68 -- # return 0 00:05:04.859 10:36:21 -- setup/devices.sh@128 -- # cleanup_nvme 00:05:04.859 10:36:21 -- setup/devices.sh@20 -- # mountpoint -q 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:04.859 10:36:21 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:04.859 10:36:21 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:04.859 10:36:21 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:04.859 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:04.859 00:05:04.859 real 0m12.306s 00:05:04.859 user 0m3.641s 00:05:04.859 sys 0m6.614s 00:05:04.859 10:36:21 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:04.859 10:36:21 -- common/autotest_common.sh@10 -- # set +x 00:05:04.859 ************************************ 00:05:04.859 END TEST nvme_mount 00:05:04.859 ************************************ 00:05:04.859 10:36:21 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:05:04.859 10:36:21 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:04.859 10:36:21 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:04.859 10:36:21 -- common/autotest_common.sh@10 -- # set +x 00:05:04.859 ************************************ 00:05:04.859 START TEST dm_mount 00:05:04.859 ************************************ 00:05:04.859 10:36:21 -- common/autotest_common.sh@1104 -- # dm_mount 00:05:04.859 10:36:21 -- setup/devices.sh@144 -- # pv=nvme0n1 00:05:04.859 10:36:21 -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:05:04.859 10:36:21 -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:05:04.859 10:36:21 -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:05:04.859 10:36:21 -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:04.859 10:36:21 -- setup/common.sh@40 -- # local part_no=2 00:05:04.859 10:36:21 -- setup/common.sh@41 -- # local size=1073741824 00:05:04.859 10:36:21 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:04.859 10:36:21 -- setup/common.sh@44 -- # parts=() 00:05:04.859 10:36:21 -- setup/common.sh@44 -- # local parts 00:05:04.859 10:36:21 -- setup/common.sh@46 -- # (( part = 1 )) 00:05:04.859 10:36:21 -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:04.859 10:36:21 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:04.859 10:36:21 -- setup/common.sh@46 -- # (( part++ )) 00:05:04.859 10:36:21 -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:04.859 10:36:21 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:04.859 10:36:21 -- setup/common.sh@46 -- # (( part++ )) 00:05:04.859 10:36:21 -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:04.859 10:36:21 -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:04.859 10:36:21 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:04.859 10:36:21 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:05:05.800 Creating new GPT entries in memory. 00:05:05.800 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:05.800 other utilities. 00:05:05.800 10:36:22 -- setup/common.sh@57 -- # (( part = 1 )) 00:05:05.800 10:36:22 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:05.800 10:36:22 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:05.800 10:36:22 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:05.800 10:36:22 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:07.178 Creating new GPT entries in memory. 00:05:07.178 The operation has completed successfully. 
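The partition loop traced above reduces to a small reusable pattern: zap any existing GPT/MBR structures, then carve fixed-size partitions under flock so concurrent setup runs cannot race on the same device node. A minimal sketch under the values visible in the trace (two partitions, 1 GiB each, 512-byte sectors); the $disk value is illustrative:

    #!/usr/bin/env bash
    # Sketch of the partition_drive flow traced above.
    disk=/dev/nvme0n1
    part_no=2
    size=$((1073741824 / 512))    # 1 GiB expressed in 512-byte sectors
    sgdisk "$disk" --zap-all      # destroy old partition tables
    part_start=0 part_end=0
    for ((part = 1; part <= part_no; part++)); do
      ((part_start = part_start == 0 ? 2048 : part_end + 1))
      ((part_end = part_start + size - 1))
      # flock serializes sgdisk calls against the same disk
      flock "$disk" sgdisk "$disk" --new=$part:$part_start:$part_end
    done

With these numbers the two calls come out as --new=1:2048:2099199 and --new=2:2099200:4196351, matching the sgdisk invocations in the trace.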
00:05:07.178 10:36:23 -- setup/common.sh@57 -- # (( part++ )) 00:05:07.178 10:36:23 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:07.178 10:36:23 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:07.178 10:36:23 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:07.178 10:36:23 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:08.115 The operation has completed successfully. 00:05:08.115 10:36:24 -- setup/common.sh@57 -- # (( part++ )) 00:05:08.115 10:36:24 -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:08.115 10:36:24 -- setup/common.sh@62 -- # wait 1954619 00:05:08.116 10:36:24 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:08.116 10:36:24 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:08.116 10:36:24 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:08.116 10:36:24 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:08.116 10:36:24 -- setup/devices.sh@160 -- # for t in {1..5} 00:05:08.116 10:36:24 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:08.116 10:36:24 -- setup/devices.sh@161 -- # break 00:05:08.116 10:36:24 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:08.116 10:36:24 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:05:08.116 10:36:24 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:08.116 10:36:24 -- setup/devices.sh@166 -- # dm=dm-0 00:05:08.116 10:36:24 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:08.116 10:36:24 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:08.116 10:36:24 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:08.116 10:36:24 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:05:08.116 10:36:24 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:08.116 10:36:24 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:08.116 10:36:24 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:08.116 10:36:24 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:08.116 10:36:24 -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:08.116 10:36:24 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:08.116 10:36:24 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:08.116 10:36:24 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:08.116 10:36:24 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:08.116 10:36:24 -- setup/devices.sh@53 -- # local found=0 00:05:08.116 10:36:24 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:08.116 10:36:24 -- setup/devices.sh@56 -- # : 
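The dm_mount test then layers a device-mapper target over the two new partitions and resolves the kernel's dm-N name for it. The dmsetup table itself is not shown in the trace, so only the resolution half is sketched here: wait for the /dev/mapper node, follow its symlink, and confirm via the holders directory that both partitions back the mapping. The sleep is an assumption; the trace shows only the retry loop itself.

    # Resolve the dm-N node behind a named mapping
    # (pattern from setup/devices.sh@160-169 above).
    dm_name=nvme_dm_test
    for t in {1..5}; do
      [[ -e /dev/mapper/$dm_name ]] && break
      sleep 1
    done
    dm=$(readlink -f "/dev/mapper/$dm_name")   # e.g. /dev/dm-0
    dm=${dm##*/}                               # -> dm-0
    [[ -e /sys/class/block/nvme0n1p1/holders/$dm ]] && echo "p1 holds $dm"
    [[ -e /sys/class/block/nvme0n1p2/holders/$dm ]] && echo "p2 holds $dm"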
00:05:08.116 10:36:24 -- setup/devices.sh@59 -- # local pci status 00:05:08.116 10:36:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:08.116 10:36:24 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:08.116 10:36:24 -- setup/devices.sh@47 -- # setup output config 00:05:08.116 10:36:24 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:08.116 10:36:24 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:11.408 10:36:27 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.408 10:36:27 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:11.408 10:36:27 -- setup/devices.sh@63 -- # found=1 00:05:11.408 10:36:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.408 10:36:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.408 10:36:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.408 10:36:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.408 10:36:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.408 10:36:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.408 10:36:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.408 10:36:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.408 10:36:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.408 10:36:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.408 10:36:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.408 10:36:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.408 10:36:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.408 10:36:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.408 10:36:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.408 10:36:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.408 10:36:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.408 10:36:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.408 10:36:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.408 10:36:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.408 10:36:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.408 10:36:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.408 10:36:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.408 10:36:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.408 10:36:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.408 10:36:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.408 10:36:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.408 10:36:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.408 10:36:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.408 10:36:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.408 10:36:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.408 10:36:27 -- setup/devices.sh@62 -- # 
[[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:11.408 10:36:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.408 10:36:27 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:11.408 10:36:27 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:11.408 10:36:27 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:11.408 10:36:27 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:11.408 10:36:27 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:11.667 10:36:27 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:11.667 10:36:27 -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:11.667 10:36:27 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:11.667 10:36:27 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:11.667 10:36:27 -- setup/devices.sh@50 -- # local mount_point= 00:05:11.667 10:36:27 -- setup/devices.sh@51 -- # local test_file= 00:05:11.667 10:36:27 -- setup/devices.sh@53 -- # local found=0 00:05:11.667 10:36:27 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:11.667 10:36:27 -- setup/devices.sh@59 -- # local pci status 00:05:11.667 10:36:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:11.667 10:36:27 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:11.667 10:36:27 -- setup/devices.sh@47 -- # setup output config 00:05:11.667 10:36:27 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:11.667 10:36:27 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:14.960 10:36:30 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.960 10:36:30 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:14.960 10:36:30 -- setup/devices.sh@63 -- # found=1 00:05:14.960 10:36:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.960 10:36:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.960 10:36:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.960 10:36:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.960 10:36:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.960 10:36:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.960 10:36:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.960 10:36:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.960 10:36:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.960 10:36:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.960 10:36:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.960 10:36:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.960 10:36:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.960 10:36:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.960 10:36:30 -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:14.960 10:36:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.960 10:36:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.960 10:36:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.960 10:36:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.960 10:36:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.960 10:36:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.960 10:36:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.960 10:36:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.960 10:36:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.960 10:36:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.960 10:36:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.960 10:36:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.960 10:36:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.960 10:36:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.960 10:36:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.960 10:36:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.960 10:36:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:14.960 10:36:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:14.960 10:36:30 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:14.960 10:36:30 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:14.960 10:36:30 -- setup/devices.sh@68 -- # return 0 00:05:14.960 10:36:30 -- setup/devices.sh@187 -- # cleanup_dm 00:05:14.961 10:36:30 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:14.961 10:36:30 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:14.961 10:36:30 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:14.961 10:36:31 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:14.961 10:36:31 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:14.961 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:14.961 10:36:31 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:14.961 10:36:31 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:14.961 00:05:14.961 real 0m9.932s 00:05:14.961 user 0m2.462s 00:05:14.961 sys 0m4.505s 00:05:14.961 10:36:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:14.961 10:36:31 -- common/autotest_common.sh@10 -- # set +x 00:05:14.961 ************************************ 00:05:14.961 END TEST dm_mount 00:05:14.961 ************************************ 00:05:14.961 10:36:31 -- setup/devices.sh@1 -- # cleanup 00:05:14.961 10:36:31 -- setup/devices.sh@11 -- # cleanup_nvme 00:05:14.961 10:36:31 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:14.961 10:36:31 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:14.961 10:36:31 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:14.961 10:36:31 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:14.961 10:36:31 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:15.220 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:15.220 /dev/nvme0n1: 8 
bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:15.220 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:15.220 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:15.220 10:36:31 -- setup/devices.sh@12 -- # cleanup_dm 00:05:15.220 10:36:31 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:15.220 10:36:31 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:15.220 10:36:31 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:15.220 10:36:31 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:15.220 10:36:31 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:15.220 10:36:31 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:15.220 00:05:15.220 real 0m26.435s 00:05:15.220 user 0m7.500s 00:05:15.220 sys 0m13.844s 00:05:15.220 10:36:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:15.220 10:36:31 -- common/autotest_common.sh@10 -- # set +x 00:05:15.220 ************************************ 00:05:15.220 END TEST devices 00:05:15.220 ************************************ 00:05:15.220 00:05:15.220 real 1m30.642s 00:05:15.220 user 0m27.850s 00:05:15.220 sys 0m51.648s 00:05:15.220 10:36:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:15.220 10:36:31 -- common/autotest_common.sh@10 -- # set +x 00:05:15.220 ************************************ 00:05:15.220 END TEST setup.sh 00:05:15.220 ************************************ 00:05:15.220 10:36:31 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:05:18.508 Hugepages 00:05:18.508 node hugesize free / total 00:05:18.508 node0 1048576kB 0 / 0 00:05:18.508 node0 2048kB 2048 / 2048 00:05:18.508 node1 1048576kB 0 / 0 00:05:18.508 node1 2048kB 0 / 0 00:05:18.508 00:05:18.508 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:18.508 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:05:18.508 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:05:18.508 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:05:18.508 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:05:18.508 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:05:18.508 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:05:18.508 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:05:18.508 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:05:18.508 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:05:18.508 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:05:18.508 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:05:18.508 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:05:18.508 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:05:18.508 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:05:18.508 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:05:18.508 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:05:18.767 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:05:18.767 10:36:34 -- spdk/autotest.sh@141 -- # uname -s 00:05:18.767 10:36:34 -- spdk/autotest.sh@141 -- # [[ Linux == Linux ]] 00:05:18.767 10:36:34 -- spdk/autotest.sh@143 -- # nvme_namespace_revert 00:05:18.767 10:36:34 -- common/autotest_common.sh@1516 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:21.306 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:21.306 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:21.306 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:21.306 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:21.306 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 
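Each "ioatdma -> vfio-pci" line in the run that follows is setup.sh detaching a device from its kernel driver and handing it to vfio-pci through sysfs. Roughly, per device (a sketch only; the real script adds allowlists, IOMMU checks, and error handling, and the BDF here is just one from the list):

    bdf=0000:00:04.0
    modprobe vfio-pci
    if [[ -e /sys/bus/pci/devices/$bdf/driver ]]; then
      echo "$bdf" > "/sys/bus/pci/devices/$bdf/driver/unbind"
    fi
    # driver_override pins the next probe of this device to vfio-pci
    echo vfio-pci > "/sys/bus/pci/devices/$bdf/driver_override"
    echo "$bdf"   > /sys/bus/pci/drivers_probe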
00:05:21.306 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:21.565 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:21.565 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:21.565 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:21.565 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:21.565 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:21.565 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:21.565 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:21.565 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:21.565 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:21.565 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:22.945 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:23.205 10:36:39 -- common/autotest_common.sh@1517 -- # sleep 1 00:05:24.309 10:36:40 -- common/autotest_common.sh@1518 -- # bdfs=() 00:05:24.309 10:36:40 -- common/autotest_common.sh@1518 -- # local bdfs 00:05:24.309 10:36:40 -- common/autotest_common.sh@1519 -- # bdfs=($(get_nvme_bdfs)) 00:05:24.309 10:36:40 -- common/autotest_common.sh@1519 -- # get_nvme_bdfs 00:05:24.309 10:36:40 -- common/autotest_common.sh@1498 -- # bdfs=() 00:05:24.309 10:36:40 -- common/autotest_common.sh@1498 -- # local bdfs 00:05:24.309 10:36:40 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:24.309 10:36:40 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:24.309 10:36:40 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:05:24.309 10:36:40 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:05:24.309 10:36:40 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:05:24.309 10:36:40 -- common/autotest_common.sh@1521 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:27.598 Waiting for block devices as requested 00:05:27.598 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:27.598 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:27.598 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:27.857 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:27.857 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:27.857 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:28.115 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:28.115 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:28.115 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:28.115 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:28.373 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:28.373 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:28.373 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:28.631 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:28.631 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:28.631 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:28.890 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:05:28.890 10:36:45 -- common/autotest_common.sh@1523 -- # for bdf in "${bdfs[@]}" 00:05:28.890 10:36:45 -- common/autotest_common.sh@1524 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:05:28.890 10:36:45 -- common/autotest_common.sh@1487 -- # readlink -f /sys/class/nvme/nvme0 00:05:28.890 10:36:45 -- common/autotest_common.sh@1487 -- # grep 0000:d8:00.0/nvme/nvme 00:05:28.890 10:36:45 -- common/autotest_common.sh@1487 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:28.890 10:36:45 -- common/autotest_common.sh@1488 -- # [[ -z 
/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:05:28.890 10:36:45 -- common/autotest_common.sh@1492 -- # basename /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:28.890 10:36:45 -- common/autotest_common.sh@1492 -- # printf '%s\n' nvme0 00:05:28.890 10:36:45 -- common/autotest_common.sh@1524 -- # nvme_ctrlr=/dev/nvme0 00:05:28.890 10:36:45 -- common/autotest_common.sh@1525 -- # [[ -z /dev/nvme0 ]] 00:05:28.890 10:36:45 -- common/autotest_common.sh@1530 -- # nvme id-ctrl /dev/nvme0 00:05:28.890 10:36:45 -- common/autotest_common.sh@1530 -- # grep oacs 00:05:28.890 10:36:45 -- common/autotest_common.sh@1530 -- # cut -d: -f2 00:05:28.890 10:36:45 -- common/autotest_common.sh@1530 -- # oacs=' 0xe' 00:05:28.890 10:36:45 -- common/autotest_common.sh@1531 -- # oacs_ns_manage=8 00:05:28.890 10:36:45 -- common/autotest_common.sh@1533 -- # [[ 8 -ne 0 ]] 00:05:28.890 10:36:45 -- common/autotest_common.sh@1539 -- # nvme id-ctrl /dev/nvme0 00:05:28.890 10:36:45 -- common/autotest_common.sh@1539 -- # cut -d: -f2 00:05:28.890 10:36:45 -- common/autotest_common.sh@1539 -- # grep unvmcap 00:05:28.890 10:36:45 -- common/autotest_common.sh@1539 -- # unvmcap=' 0' 00:05:28.890 10:36:45 -- common/autotest_common.sh@1540 -- # [[ 0 -eq 0 ]] 00:05:28.890 10:36:45 -- common/autotest_common.sh@1542 -- # continue 00:05:28.890 10:36:45 -- spdk/autotest.sh@146 -- # timing_exit pre_cleanup 00:05:28.890 10:36:45 -- common/autotest_common.sh@718 -- # xtrace_disable 00:05:28.890 10:36:45 -- common/autotest_common.sh@10 -- # set +x 00:05:29.149 10:36:45 -- spdk/autotest.sh@149 -- # timing_enter afterboot 00:05:29.149 10:36:45 -- common/autotest_common.sh@712 -- # xtrace_disable 00:05:29.149 10:36:45 -- common/autotest_common.sh@10 -- # set +x 00:05:29.149 10:36:45 -- spdk/autotest.sh@150 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:32.442 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:32.442 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:32.442 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:32.442 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:32.442 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:32.442 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:32.442 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:32.442 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:32.442 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:32.442 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:32.442 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:32.442 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:32.442 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:32.442 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:32.442 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:32.442 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:34.348 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:34.348 10:36:50 -- spdk/autotest.sh@151 -- # timing_exit afterboot 00:05:34.348 10:36:50 -- common/autotest_common.sh@718 -- # xtrace_disable 00:05:34.348 10:36:50 -- common/autotest_common.sh@10 -- # set +x 00:05:34.348 10:36:50 -- spdk/autotest.sh@155 -- # opal_revert_cleanup 00:05:34.348 10:36:50 -- common/autotest_common.sh@1576 -- # mapfile -t bdfs 00:05:34.348 10:36:50 -- common/autotest_common.sh@1576 -- # get_nvme_bdfs_by_id 0x0a54 00:05:34.348 10:36:50 -- common/autotest_common.sh@1562 -- # bdfs=() 00:05:34.348 10:36:50 -- common/autotest_common.sh@1562 -- # local bdfs 00:05:34.348 10:36:50 -- common/autotest_common.sh@1564 -- # 
get_nvme_bdfs 00:05:34.348 10:36:50 -- common/autotest_common.sh@1498 -- # bdfs=() 00:05:34.348 10:36:50 -- common/autotest_common.sh@1498 -- # local bdfs 00:05:34.348 10:36:50 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:34.348 10:36:50 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:34.348 10:36:50 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:05:34.348 10:36:50 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:05:34.348 10:36:50 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:d8:00.0 00:05:34.348 10:36:50 -- common/autotest_common.sh@1564 -- # for bdf in $(get_nvme_bdfs) 00:05:34.348 10:36:50 -- common/autotest_common.sh@1565 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:05:34.348 10:36:50 -- common/autotest_common.sh@1565 -- # device=0x0a54 00:05:34.348 10:36:50 -- common/autotest_common.sh@1566 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:05:34.348 10:36:50 -- common/autotest_common.sh@1567 -- # bdfs+=($bdf) 00:05:34.348 10:36:50 -- common/autotest_common.sh@1571 -- # printf '%s\n' 0000:d8:00.0 00:05:34.348 10:36:50 -- common/autotest_common.sh@1577 -- # [[ -z 0000:d8:00.0 ]] 00:05:34.348 10:36:50 -- common/autotest_common.sh@1582 -- # spdk_tgt_pid=1964596 00:05:34.348 10:36:50 -- common/autotest_common.sh@1583 -- # waitforlisten 1964596 00:05:34.348 10:36:50 -- common/autotest_common.sh@1581 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:34.348 10:36:50 -- common/autotest_common.sh@819 -- # '[' -z 1964596 ']' 00:05:34.348 10:36:50 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:34.348 10:36:50 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:34.348 10:36:50 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:34.348 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:34.348 10:36:50 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:34.348 10:36:50 -- common/autotest_common.sh@10 -- # set +x 00:05:34.348 [2024-07-13 10:36:50.675356] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
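The get_nvme_bdfs helper above asks gen_nvme.sh for controller addresses and then filters them by PCI device ID (0x0a54, as read from sysfs in the trace; mapping that ID to a specific SSD model would be an assumption, so only the raw ID is used here). The same discovery can be sketched with a plain sysfs walk in place of gen_nvme.sh:

    # Enumerate NVMe controllers and keep those matching a device ID.
    want=0x0a54
    bdfs=()
    for link in /sys/class/nvme/nvme*/device; do
      addr=$(basename "$(readlink -f "$link")")   # e.g. 0000:d8:00.0
      [[ $(cat "/sys/bus/pci/devices/$addr/device") == "$want" ]] \
        && bdfs+=("$addr")
    done
    printf '%s\n' "${bdfs[@]}"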
00:05:34.348 [2024-07-13 10:36:50.675422] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1964596 ] 00:05:34.348 EAL: No free 2048 kB hugepages reported on node 1 00:05:34.607 [2024-07-13 10:36:50.744579] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:34.607 [2024-07-13 10:36:50.785015] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:34.607 [2024-07-13 10:36:50.785134] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:35.173 10:36:51 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:35.173 10:36:51 -- common/autotest_common.sh@852 -- # return 0 00:05:35.173 10:36:51 -- common/autotest_common.sh@1585 -- # bdf_id=0 00:05:35.173 10:36:51 -- common/autotest_common.sh@1586 -- # for bdf in "${bdfs[@]}" 00:05:35.173 10:36:51 -- common/autotest_common.sh@1587 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:05:38.461 nvme0n1 00:05:38.461 10:36:54 -- common/autotest_common.sh@1589 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:05:38.461 [2024-07-13 10:36:54.632939] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:05:38.461 request: 00:05:38.461 { 00:05:38.461 "nvme_ctrlr_name": "nvme0", 00:05:38.461 "password": "test", 00:05:38.461 "method": "bdev_nvme_opal_revert", 00:05:38.461 "req_id": 1 00:05:38.461 } 00:05:38.461 Got JSON-RPC error response 00:05:38.461 response: 00:05:38.461 { 00:05:38.461 "code": -32602, 00:05:38.461 "message": "Invalid parameters" 00:05:38.461 } 00:05:38.461 10:36:54 -- common/autotest_common.sh@1589 -- # true 00:05:38.461 10:36:54 -- common/autotest_common.sh@1590 -- # (( ++bdf_id )) 00:05:38.461 10:36:54 -- common/autotest_common.sh@1593 -- # killprocess 1964596 00:05:38.461 10:36:54 -- common/autotest_common.sh@926 -- # '[' -z 1964596 ']' 00:05:38.461 10:36:54 -- common/autotest_common.sh@930 -- # kill -0 1964596 00:05:38.461 10:36:54 -- common/autotest_common.sh@931 -- # uname 00:05:38.461 10:36:54 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:38.461 10:36:54 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1964596 00:05:38.461 10:36:54 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:38.461 10:36:54 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:38.461 10:36:54 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1964596' 00:05:38.461 killing process with pid 1964596 00:05:38.461 10:36:54 -- common/autotest_common.sh@945 -- # kill 1964596 00:05:38.461 10:36:54 -- common/autotest_common.sh@950 -- # wait 1964596 00:05:40.997 10:36:56 -- spdk/autotest.sh@161 -- # '[' 0 -eq 1 ']' 00:05:40.997 10:36:56 -- spdk/autotest.sh@165 -- # '[' 1 -eq 1 ']' 00:05:40.997 10:36:56 -- spdk/autotest.sh@166 -- # [[ 0 -eq 1 ]] 00:05:40.997 10:36:56 -- spdk/autotest.sh@166 -- # [[ 0 -eq 1 ]] 00:05:40.997 10:36:56 -- spdk/autotest.sh@173 -- # timing_enter lib 00:05:40.997 10:36:56 -- common/autotest_common.sh@712 -- # xtrace_disable 00:05:40.997 10:36:56 -- common/autotest_common.sh@10 -- # set +x 00:05:40.997 10:36:56 -- spdk/autotest.sh@175 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:40.997 
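The opal_revert failure above is the interesting part of this block: spdk_tgt serves JSON-RPC over a UNIX socket (/var/tmp/spdk.sock by default) and rpc.py wraps each method; a controller without OPAL support answers bdev_nvme_opal_revert with the -32602 "Invalid parameters" error shown, which the test treats as an acceptable outcome. The exchange, assuming the target is already running and listening:

    # Reproduce the RPC sequence from the trace
    # (paths relative to the spdk checkout).
    ./scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0
    ./scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test
    # -> {"code": -32602, "message": "Invalid parameters"} when the
    #    controller does not support OPAL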
10:36:56 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:40.997 10:36:56 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:40.997 10:36:56 -- common/autotest_common.sh@10 -- # set +x 00:05:40.997 ************************************ 00:05:40.997 START TEST env 00:05:40.997 ************************************ 00:05:40.997 10:36:56 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:40.997 * Looking for test storage... 00:05:40.997 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:05:40.997 10:36:56 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:40.997 10:36:56 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:40.997 10:36:56 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:40.997 10:36:56 -- common/autotest_common.sh@10 -- # set +x 00:05:40.997 ************************************ 00:05:40.997 START TEST env_memory 00:05:40.997 ************************************ 00:05:40.997 10:36:56 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:40.997 00:05:40.997 00:05:40.997 CUnit - A unit testing framework for C - Version 2.1-3 00:05:40.997 http://cunit.sourceforge.net/ 00:05:40.997 00:05:40.997 00:05:40.997 Suite: memory 00:05:40.997 Test: alloc and free memory map ...[2024-07-13 10:36:57.004559] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:40.997 passed 00:05:40.997 Test: mem map translation ...[2024-07-13 10:36:57.018877] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:40.997 [2024-07-13 10:36:57.018895] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:40.997 [2024-07-13 10:36:57.018925] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:40.997 [2024-07-13 10:36:57.018933] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:40.997 passed 00:05:40.997 Test: mem map registration ...[2024-07-13 10:36:57.041461] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:40.997 [2024-07-13 10:36:57.041480] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:40.997 passed 00:05:40.997 Test: mem map adjacent registrations ...passed 00:05:40.997 00:05:40.997 Run Summary: Type Total Ran Passed Failed Inactive 00:05:40.997 suites 1 1 n/a 0 0 00:05:40.997 tests 4 4 4 0 0 00:05:40.997 asserts 152 152 152 0 n/a 00:05:40.997 00:05:40.997 Elapsed time = 0.090 seconds 00:05:40.997 00:05:40.997 real 0m0.102s 00:05:40.997 user 0m0.091s 00:05:40.997 sys 0m0.011s 00:05:40.997 10:36:57 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:40.997 10:36:57 -- 
common/autotest_common.sh@10 -- # set +x 00:05:40.997 ************************************ 00:05:40.997 END TEST env_memory 00:05:40.998 ************************************ 00:05:40.998 10:36:57 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:40.998 10:36:57 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:40.998 10:36:57 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:40.998 10:36:57 -- common/autotest_common.sh@10 -- # set +x 00:05:40.998 ************************************ 00:05:40.998 START TEST env_vtophys 00:05:40.998 ************************************ 00:05:40.998 10:36:57 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:40.998 EAL: lib.eal log level changed from notice to debug 00:05:40.998 EAL: Detected lcore 0 as core 0 on socket 0 00:05:40.998 EAL: Detected lcore 1 as core 1 on socket 0 00:05:40.998 EAL: Detected lcore 2 as core 2 on socket 0 00:05:40.998 EAL: Detected lcore 3 as core 3 on socket 0 00:05:40.998 EAL: Detected lcore 4 as core 4 on socket 0 00:05:40.998 EAL: Detected lcore 5 as core 5 on socket 0 00:05:40.998 EAL: Detected lcore 6 as core 6 on socket 0 00:05:40.998 EAL: Detected lcore 7 as core 8 on socket 0 00:05:40.998 EAL: Detected lcore 8 as core 9 on socket 0 00:05:40.998 EAL: Detected lcore 9 as core 10 on socket 0 00:05:40.998 EAL: Detected lcore 10 as core 11 on socket 0 00:05:40.998 EAL: Detected lcore 11 as core 12 on socket 0 00:05:40.998 EAL: Detected lcore 12 as core 13 on socket 0 00:05:40.998 EAL: Detected lcore 13 as core 14 on socket 0 00:05:40.998 EAL: Detected lcore 14 as core 16 on socket 0 00:05:40.998 EAL: Detected lcore 15 as core 17 on socket 0 00:05:40.998 EAL: Detected lcore 16 as core 18 on socket 0 00:05:40.998 EAL: Detected lcore 17 as core 19 on socket 0 00:05:40.998 EAL: Detected lcore 18 as core 20 on socket 0 00:05:40.998 EAL: Detected lcore 19 as core 21 on socket 0 00:05:40.998 EAL: Detected lcore 20 as core 22 on socket 0 00:05:40.998 EAL: Detected lcore 21 as core 24 on socket 0 00:05:40.998 EAL: Detected lcore 22 as core 25 on socket 0 00:05:40.998 EAL: Detected lcore 23 as core 26 on socket 0 00:05:40.998 EAL: Detected lcore 24 as core 27 on socket 0 00:05:40.998 EAL: Detected lcore 25 as core 28 on socket 0 00:05:40.998 EAL: Detected lcore 26 as core 29 on socket 0 00:05:40.998 EAL: Detected lcore 27 as core 30 on socket 0 00:05:40.998 EAL: Detected lcore 28 as core 0 on socket 1 00:05:40.998 EAL: Detected lcore 29 as core 1 on socket 1 00:05:40.998 EAL: Detected lcore 30 as core 2 on socket 1 00:05:40.998 EAL: Detected lcore 31 as core 3 on socket 1 00:05:40.998 EAL: Detected lcore 32 as core 4 on socket 1 00:05:40.998 EAL: Detected lcore 33 as core 5 on socket 1 00:05:40.998 EAL: Detected lcore 34 as core 6 on socket 1 00:05:40.998 EAL: Detected lcore 35 as core 8 on socket 1 00:05:40.998 EAL: Detected lcore 36 as core 9 on socket 1 00:05:40.998 EAL: Detected lcore 37 as core 10 on socket 1 00:05:40.998 EAL: Detected lcore 38 as core 11 on socket 1 00:05:40.998 EAL: Detected lcore 39 as core 12 on socket 1 00:05:40.998 EAL: Detected lcore 40 as core 13 on socket 1 00:05:40.998 EAL: Detected lcore 41 as core 14 on socket 1 00:05:40.998 EAL: Detected lcore 42 as core 16 on socket 1 00:05:40.998 EAL: Detected lcore 43 as core 17 on socket 1 00:05:40.998 EAL: Detected lcore 44 as core 18 on socket 1 00:05:40.998 EAL: Detected lcore 45 as core 19 on 
socket 1 00:05:40.998 EAL: Detected lcore 46 as core 20 on socket 1 00:05:40.998 EAL: Detected lcore 47 as core 21 on socket 1 00:05:40.998 EAL: Detected lcore 48 as core 22 on socket 1 00:05:40.998 EAL: Detected lcore 49 as core 24 on socket 1 00:05:40.998 EAL: Detected lcore 50 as core 25 on socket 1 00:05:40.998 EAL: Detected lcore 51 as core 26 on socket 1 00:05:40.998 EAL: Detected lcore 52 as core 27 on socket 1 00:05:40.998 EAL: Detected lcore 53 as core 28 on socket 1 00:05:40.998 EAL: Detected lcore 54 as core 29 on socket 1 00:05:40.998 EAL: Detected lcore 55 as core 30 on socket 1 00:05:40.998 EAL: Detected lcore 56 as core 0 on socket 0 00:05:40.998 EAL: Detected lcore 57 as core 1 on socket 0 00:05:40.998 EAL: Detected lcore 58 as core 2 on socket 0 00:05:40.998 EAL: Detected lcore 59 as core 3 on socket 0 00:05:40.998 EAL: Detected lcore 60 as core 4 on socket 0 00:05:40.998 EAL: Detected lcore 61 as core 5 on socket 0 00:05:40.998 EAL: Detected lcore 62 as core 6 on socket 0 00:05:40.998 EAL: Detected lcore 63 as core 8 on socket 0 00:05:40.998 EAL: Detected lcore 64 as core 9 on socket 0 00:05:40.999 EAL: Detected lcore 65 as core 10 on socket 0 00:05:40.999 EAL: Detected lcore 66 as core 11 on socket 0 00:05:40.999 EAL: Detected lcore 67 as core 12 on socket 0 00:05:40.999 EAL: Detected lcore 68 as core 13 on socket 0 00:05:40.999 EAL: Detected lcore 69 as core 14 on socket 0 00:05:40.999 EAL: Detected lcore 70 as core 16 on socket 0 00:05:40.999 EAL: Detected lcore 71 as core 17 on socket 0 00:05:40.999 EAL: Detected lcore 72 as core 18 on socket 0 00:05:40.999 EAL: Detected lcore 73 as core 19 on socket 0 00:05:40.999 EAL: Detected lcore 74 as core 20 on socket 0 00:05:40.999 EAL: Detected lcore 75 as core 21 on socket 0 00:05:40.999 EAL: Detected lcore 76 as core 22 on socket 0 00:05:40.999 EAL: Detected lcore 77 as core 24 on socket 0 00:05:40.999 EAL: Detected lcore 78 as core 25 on socket 0 00:05:40.999 EAL: Detected lcore 79 as core 26 on socket 0 00:05:40.999 EAL: Detected lcore 80 as core 27 on socket 0 00:05:40.999 EAL: Detected lcore 81 as core 28 on socket 0 00:05:40.999 EAL: Detected lcore 82 as core 29 on socket 0 00:05:40.999 EAL: Detected lcore 83 as core 30 on socket 0 00:05:40.999 EAL: Detected lcore 84 as core 0 on socket 1 00:05:40.999 EAL: Detected lcore 85 as core 1 on socket 1 00:05:40.999 EAL: Detected lcore 86 as core 2 on socket 1 00:05:40.999 EAL: Detected lcore 87 as core 3 on socket 1 00:05:40.999 EAL: Detected lcore 88 as core 4 on socket 1 00:05:40.999 EAL: Detected lcore 89 as core 5 on socket 1 00:05:40.999 EAL: Detected lcore 90 as core 6 on socket 1 00:05:40.999 EAL: Detected lcore 91 as core 8 on socket 1 00:05:40.999 EAL: Detected lcore 92 as core 9 on socket 1 00:05:40.999 EAL: Detected lcore 93 as core 10 on socket 1 00:05:40.999 EAL: Detected lcore 94 as core 11 on socket 1 00:05:40.999 EAL: Detected lcore 95 as core 12 on socket 1 00:05:40.999 EAL: Detected lcore 96 as core 13 on socket 1 00:05:40.999 EAL: Detected lcore 97 as core 14 on socket 1 00:05:40.999 EAL: Detected lcore 98 as core 16 on socket 1 00:05:40.999 EAL: Detected lcore 99 as core 17 on socket 1 00:05:40.999 EAL: Detected lcore 100 as core 18 on socket 1 00:05:40.999 EAL: Detected lcore 101 as core 19 on socket 1 00:05:40.999 EAL: Detected lcore 102 as core 20 on socket 1 00:05:40.999 EAL: Detected lcore 103 as core 21 on socket 1 00:05:40.999 EAL: Detected lcore 104 as core 22 on socket 1 00:05:40.999 EAL: Detected lcore 105 as core 24 on socket 1 00:05:40.999 EAL: 
Detected lcore 106 as core 25 on socket 1 00:05:40.999 EAL: Detected lcore 107 as core 26 on socket 1 00:05:40.999 EAL: Detected lcore 108 as core 27 on socket 1 00:05:40.999 EAL: Detected lcore 109 as core 28 on socket 1 00:05:40.999 EAL: Detected lcore 110 as core 29 on socket 1 00:05:40.999 EAL: Detected lcore 111 as core 30 on socket 1 00:05:40.999 EAL: Maximum logical cores by configuration: 128 00:05:40.999 EAL: Detected CPU lcores: 112 00:05:40.999 EAL: Detected NUMA nodes: 2 00:05:40.999 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:05:40.999 EAL: Checking presence of .so 'librte_eal.so.24' 00:05:40.999 EAL: Checking presence of .so 'librte_eal.so' 00:05:40.999 EAL: Detected static linkage of DPDK 00:05:40.999 EAL: No shared files mode enabled, IPC will be disabled 00:05:40.999 EAL: Bus pci wants IOVA as 'DC' 00:05:40.999 EAL: Buses did not request a specific IOVA mode. 00:05:40.999 EAL: IOMMU is available, selecting IOVA as VA mode. 00:05:40.999 EAL: Selected IOVA mode 'VA' 00:05:40.999 EAL: No free 2048 kB hugepages reported on node 1 00:05:40.999 EAL: Probing VFIO support... 00:05:40.999 EAL: IOMMU type 1 (Type 1) is supported 00:05:40.999 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:40.999 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:40.999 EAL: VFIO support initialized 00:05:40.999 EAL: Ask a virtual area of 0x2e000 bytes 00:05:40.999 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:41.000 EAL: Setting up physically contiguous memory... 00:05:41.000 EAL: Setting maximum number of open files to 524288 00:05:41.000 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:41.000 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:41.000 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:41.000 EAL: Ask a virtual area of 0x61000 bytes 00:05:41.000 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:41.000 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:41.000 EAL: Ask a virtual area of 0x400000000 bytes 00:05:41.000 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:41.000 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:41.000 EAL: Ask a virtual area of 0x61000 bytes 00:05:41.000 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:41.000 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:41.000 EAL: Ask a virtual area of 0x400000000 bytes 00:05:41.000 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:41.000 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:41.000 EAL: Ask a virtual area of 0x61000 bytes 00:05:41.000 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:41.000 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:41.000 EAL: Ask a virtual area of 0x400000000 bytes 00:05:41.000 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:41.000 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:41.000 EAL: Ask a virtual area of 0x61000 bytes 00:05:41.000 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:41.000 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:41.000 EAL: Ask a virtual area of 0x400000000 bytes 00:05:41.000 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:41.000 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:41.000 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:41.000 
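Before reserving those memseg lists, EAL sizes itself against the per-node hugepage pools; the numbers it works with are the same ones the setup.sh status table earlier in this log reported (node0 2048kB 2048 / 2048, node1 2048kB 0 / 0). A quick way to read them directly:

    # Per-NUMA-node 2 MiB hugepage pools, as EAL sees them.
    for node in /sys/devices/system/node/node[0-9]*; do
      hp=$node/hugepages/hugepages-2048kB
      printf '%s: %s free / %s total\n' "${node##*/}" \
        "$(cat "$hp/free_hugepages")" "$(cat "$hp/nr_hugepages")"
    done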
00:05:41.000 EAL: Ask a virtual area of 0x61000 bytes
00:05:41.000 EAL: Virtual area found at 0x201000800000 (size = 0x61000)
00:05:41.000 EAL: Memseg list allocated at socket 1, page size 0x800kB
00:05:41.000 EAL: Ask a virtual area of 0x400000000 bytes
00:05:41.000 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000)
00:05:41.000 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000
00:05:41.000 EAL: Ask a virtual area of 0x61000 bytes
00:05:41.000 EAL: Virtual area found at 0x201400a00000 (size = 0x61000)
00:05:41.000 EAL: Memseg list allocated at socket 1, page size 0x800kB
00:05:41.000 EAL: Ask a virtual area of 0x400000000 bytes
00:05:41.000 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000)
00:05:41.000 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000
00:05:41.000 EAL: Ask a virtual area of 0x61000 bytes
00:05:41.000 EAL: Virtual area found at 0x201800c00000 (size = 0x61000)
00:05:41.000 EAL: Memseg list allocated at socket 1, page size 0x800kB
00:05:41.000 EAL: Ask a virtual area of 0x400000000 bytes
00:05:41.000 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000)
00:05:41.000 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000
00:05:41.000 EAL: Ask a virtual area of 0x61000 bytes
00:05:41.000 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000)
00:05:41.000 EAL: Memseg list allocated at socket 1, page size 0x800kB
00:05:41.000 EAL: Ask a virtual area of 0x400000000 bytes
00:05:41.000 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000)
00:05:41.000 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000
00:05:41.000 EAL: Hugepages will be freed exactly as allocated.
00:05:41.000 EAL: No shared files mode enabled, IPC is disabled
00:05:41.000 EAL: No shared files mode enabled, IPC is disabled
00:05:41.000 EAL: TSC frequency is ~2500000 KHz
00:05:41.000 EAL: Main lcore 0 is ready (tid=7f248e23ea00;cpuset=[0])
00:05:41.000 EAL: Trying to obtain current memory policy.
00:05:41.000 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:41.000 EAL: Restoring previous memory policy: 0
00:05:41.000 EAL: request: mp_malloc_sync
00:05:41.000 EAL: No shared files mode enabled, IPC is disabled
00:05:41.000 EAL: Heap on socket 0 was expanded by 2MB
00:05:41.000 EAL: No shared files mode enabled, IPC is disabled
00:05:41.000 EAL: Mem event callback 'spdk:(nil)' registered
00:05:41.000
00:05:41.000
00:05:41.000 CUnit - A unit testing framework for C - Version 2.1-3
00:05:41.000 http://cunit.sourceforge.net/
00:05:41.000
00:05:41.000
00:05:41.000 Suite: components_suite
00:05:41.000 Test: vtophys_malloc_test ...passed
00:05:41.000 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy.
00:05:41.000 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:41.001 EAL: Restoring previous memory policy: 4
00:05:41.001 EAL: Calling mem event callback 'spdk:(nil)'
00:05:41.001 EAL: request: mp_malloc_sync
00:05:41.001 EAL: No shared files mode enabled, IPC is disabled
00:05:41.001 EAL: Heap on socket 0 was expanded by 4MB
00:05:41.001 EAL: Calling mem event callback 'spdk:(nil)'
00:05:41.001 EAL: request: mp_malloc_sync
00:05:41.001 EAL: No shared files mode enabled, IPC is disabled
00:05:41.001 EAL: Heap on socket 0 was shrunk by 4MB
00:05:41.001 EAL: Trying to obtain current memory policy.
00:05:41.001 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:41.001 EAL: Restoring previous memory policy: 4
00:05:41.001 EAL: Calling mem event callback 'spdk:(nil)'
00:05:41.001 EAL: request: mp_malloc_sync
00:05:41.001 EAL: No shared files mode enabled, IPC is disabled
00:05:41.001 EAL: Heap on socket 0 was expanded by 6MB
00:05:41.001 EAL: Calling mem event callback 'spdk:(nil)'
00:05:41.001 EAL: request: mp_malloc_sync
00:05:41.001 EAL: No shared files mode enabled, IPC is disabled
00:05:41.001 EAL: Heap on socket 0 was shrunk by 6MB
00:05:41.001 EAL: Trying to obtain current memory policy.
00:05:41.001 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:41.001 EAL: Restoring previous memory policy: 4
00:05:41.001 EAL: Calling mem event callback 'spdk:(nil)'
00:05:41.001 EAL: request: mp_malloc_sync
00:05:41.001 EAL: No shared files mode enabled, IPC is disabled
00:05:41.001 EAL: Heap on socket 0 was expanded by 10MB
00:05:41.001 EAL: Calling mem event callback 'spdk:(nil)'
00:05:41.001 EAL: request: mp_malloc_sync
00:05:41.001 EAL: No shared files mode enabled, IPC is disabled
00:05:41.001 EAL: Heap on socket 0 was shrunk by 10MB
00:05:41.001 EAL: Trying to obtain current memory policy.
00:05:41.001 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:41.001 EAL: Restoring previous memory policy: 4
00:05:41.001 EAL: Calling mem event callback 'spdk:(nil)'
00:05:41.001 EAL: request: mp_malloc_sync
00:05:41.001 EAL: No shared files mode enabled, IPC is disabled
00:05:41.001 EAL: Heap on socket 0 was expanded by 18MB
00:05:41.001 EAL: Calling mem event callback 'spdk:(nil)'
00:05:41.001 EAL: request: mp_malloc_sync
00:05:41.001 EAL: No shared files mode enabled, IPC is disabled
00:05:41.001 EAL: Heap on socket 0 was shrunk by 18MB
00:05:41.001 EAL: Trying to obtain current memory policy.
00:05:41.001 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:41.001 EAL: Restoring previous memory policy: 4
00:05:41.001 EAL: Calling mem event callback 'spdk:(nil)'
00:05:41.001 EAL: request: mp_malloc_sync
00:05:41.001 EAL: No shared files mode enabled, IPC is disabled
00:05:41.001 EAL: Heap on socket 0 was expanded by 34MB
00:05:41.001 EAL: Calling mem event callback 'spdk:(nil)'
00:05:41.001 EAL: request: mp_malloc_sync
00:05:41.001 EAL: No shared files mode enabled, IPC is disabled
00:05:41.001 EAL: Heap on socket 0 was shrunk by 34MB
00:05:41.001 EAL: Trying to obtain current memory policy.
00:05:41.001 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:41.001 EAL: Restoring previous memory policy: 4
00:05:41.001 EAL: Calling mem event callback 'spdk:(nil)'
00:05:41.001 EAL: request: mp_malloc_sync
00:05:41.001 EAL: No shared files mode enabled, IPC is disabled
00:05:41.001 EAL: Heap on socket 0 was expanded by 66MB
00:05:41.001 EAL: Calling mem event callback 'spdk:(nil)'
00:05:41.001 EAL: request: mp_malloc_sync
00:05:41.001 EAL: No shared files mode enabled, IPC is disabled
00:05:41.001 EAL: Heap on socket 0 was shrunk by 66MB
00:05:41.001 EAL: Trying to obtain current memory policy.
00:05:41.001 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:41.001 EAL: Restoring previous memory policy: 4
00:05:41.001 EAL: Calling mem event callback 'spdk:(nil)'
00:05:41.001 EAL: request: mp_malloc_sync
00:05:41.001 EAL: No shared files mode enabled, IPC is disabled
00:05:41.001 EAL: Heap on socket 0 was expanded by 130MB
00:05:41.002 EAL: Calling mem event callback 'spdk:(nil)'
00:05:41.002 EAL: request: mp_malloc_sync
00:05:41.002 EAL: No shared files mode enabled, IPC is disabled
00:05:41.002 EAL: Heap on socket 0 was shrunk by 130MB
00:05:41.002 EAL: Trying to obtain current memory policy.
00:05:41.002 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:41.002 EAL: Restoring previous memory policy: 4
00:05:41.002 EAL: Calling mem event callback 'spdk:(nil)'
00:05:41.002 EAL: request: mp_malloc_sync
00:05:41.002 EAL: No shared files mode enabled, IPC is disabled
00:05:41.002 EAL: Heap on socket 0 was expanded by 258MB
00:05:41.002 EAL: Calling mem event callback 'spdk:(nil)'
00:05:41.261 EAL: request: mp_malloc_sync
00:05:41.261 EAL: No shared files mode enabled, IPC is disabled
00:05:41.261 EAL: Heap on socket 0 was shrunk by 258MB
00:05:41.261 EAL: Trying to obtain current memory policy.
00:05:41.261 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:41.261 EAL: Restoring previous memory policy: 4
00:05:41.261 EAL: Calling mem event callback 'spdk:(nil)'
00:05:41.261 EAL: request: mp_malloc_sync
00:05:41.261 EAL: No shared files mode enabled, IPC is disabled
00:05:41.261 EAL: Heap on socket 0 was expanded by 514MB
00:05:41.261 EAL: Calling mem event callback 'spdk:(nil)'
00:05:41.521 EAL: request: mp_malloc_sync
00:05:41.521 EAL: No shared files mode enabled, IPC is disabled
00:05:41.521 EAL: Heap on socket 0 was shrunk by 514MB
00:05:41.521 EAL: Trying to obtain current memory policy.
00:05:41.521 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:41.521 EAL: Restoring previous memory policy: 4
00:05:41.521 EAL: Calling mem event callback 'spdk:(nil)'
00:05:41.521 EAL: request: mp_malloc_sync
00:05:41.521 EAL: No shared files mode enabled, IPC is disabled
00:05:41.521 EAL: Heap on socket 0 was expanded by 1026MB
00:05:41.779 EAL: Calling mem event callback 'spdk:(nil)'
00:05:42.038 EAL: request: mp_malloc_sync
00:05:42.038 EAL: No shared files mode enabled, IPC is disabled
00:05:42.038 EAL: Heap on socket 0 was shrunk by 1026MB
00:05:42.038 passed
00:05:42.038
00:05:42.039 Run Summary: Type Total Ran Passed Failed Inactive
00:05:42.039 suites 1 1 n/a 0 0
00:05:42.039 tests 2 2 2 0 0
00:05:42.039 asserts 497 497 497 0 n/a
00:05:42.039
00:05:42.039 Elapsed time = 0.958 seconds
00:05:42.039 EAL: Calling mem event callback 'spdk:(nil)'
00:05:42.039 EAL: request: mp_malloc_sync
00:05:42.039 EAL: No shared files mode enabled, IPC is disabled
00:05:42.039 EAL: Heap on socket 0 was shrunk by 2MB
00:05:42.039 EAL: No shared files mode enabled, IPC is disabled
00:05:42.039 EAL: No shared files mode enabled, IPC is disabled
00:05:42.039 EAL: No shared files mode enabled, IPC is disabled
00:05:42.039
00:05:42.039 real 0m1.070s
00:05:42.039 user 0m0.632s
00:05:42.039 sys 0m0.416s
00:05:42.039 10:36:58 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:42.039 10:36:58 -- common/autotest_common.sh@10 -- # set +x
00:05:42.039 ************************************
00:05:42.039 END TEST env_vtophys
00:05:42.039 ************************************
00:05:42.039 10:36:58 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut
00:05:42.039 10:36:58 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:05:42.039 10:36:58 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:05:42.039 10:36:58 -- common/autotest_common.sh@10 -- # set +x
00:05:42.039 ************************************
00:05:42.039 START TEST env_pci
00:05:42.039 ************************************
00:05:42.039 10:36:58 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut
00:05:42.039
00:05:42.039
00:05:42.039 CUnit - A unit testing framework for C - Version 2.1-3
00:05:42.039 http://cunit.sourceforge.net/
00:05:42.039
00:05:42.039
00:05:42.039 Suite: pci
00:05:42.039 Test: pci_hook ...[2024-07-13 10:36:58.240292] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 1965947 has claimed it
00:05:42.039 EAL: Cannot find device (10000:00:01.0)
00:05:42.039 EAL: Failed to attach device on primary process
00:05:42.039 passed
00:05:42.039
00:05:42.039 Run Summary: Type Total Ran Passed Failed Inactive
00:05:42.039 suites 1 1 n/a 0 0
00:05:42.039 tests 1 1 1 0 0
00:05:42.039 asserts 25 25 25 0 n/a
00:05:42.039
00:05:42.039 Elapsed time = 0.037 seconds
00:05:42.039
00:05:42.039 real 0m0.055s
00:05:42.039 user 0m0.012s
00:05:42.039 sys 0m0.043s
00:05:42.039 10:36:58 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:42.039 10:36:58 -- common/autotest_common.sh@10 -- # set +x
00:05:42.039 ************************************
00:05:42.039 END TEST env_pci
00:05:42.039 ************************************
00:05:42.039 10:36:58 -- env/env.sh@14 -- # argv='-c 0x1 '
00:05:42.039 10:36:58 -- env/env.sh@15 -- # uname
00:05:42.039 10:36:58 -- env/env.sh@15 -- # '[' Linux = Linux ']'
00:05:42.039 10:36:58 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000
00:05:42.039 10:36:58 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:05:42.039 10:36:58 -- common/autotest_common.sh@1077 -- # '[' 5 -le 1 ']'
00:05:42.039 10:36:58 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:05:42.039 10:36:58 -- common/autotest_common.sh@10 -- # set +x
00:05:42.039 ************************************
00:05:42.039 START TEST env_dpdk_post_init
00:05:42.039 ************************************
00:05:42.039 10:36:58 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:05:42.039 EAL: Detected CPU lcores: 112
00:05:42.039 EAL: Detected NUMA nodes: 2
00:05:42.039 EAL: Detected static linkage of DPDK
00:05:42.039 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket
00:05:42.039 EAL: Selected IOVA mode 'VA'
00:05:42.039 EAL: No free 2048 kB hugepages reported on node 1
00:05:42.039 EAL: VFIO support initialized
00:05:42.039 TELEMETRY: No legacy callbacks, legacy socket not created
00:05:42.299 EAL: Using IOMMU type 1 (Type 1)
00:05:42.868 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1)
00:05:47.059 EAL: Releasing PCI mapped resource for 0000:d8:00.0
00:05:47.059 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000
00:05:47.059 Starting DPDK initialization...
00:05:47.059 Starting SPDK post initialization...
00:05:47.059 SPDK NVMe probe
00:05:47.059 Attaching to 0000:d8:00.0
00:05:47.059 Attached to 0000:d8:00.0
00:05:47.059 Cleaning up...
00:05:47.059
00:05:47.059 real 0m4.736s
00:05:47.059 user 0m3.567s
00:05:47.059 sys 0m0.413s
00:05:47.059 10:37:03 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:47.059 10:37:03 -- common/autotest_common.sh@10 -- # set +x
00:05:47.059 ************************************
00:05:47.059 END TEST env_dpdk_post_init
00:05:47.059 ************************************
00:05:47.059 10:37:03 -- env/env.sh@26 -- # uname
00:05:47.059 10:37:03 -- env/env.sh@26 -- # '[' Linux = Linux ']'
00:05:47.059 10:37:03 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks
00:05:47.059 10:37:03 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:05:47.059 10:37:03 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:05:47.059 10:37:03 -- common/autotest_common.sh@10 -- # set +x
00:05:47.059 ************************************
00:05:47.059 START TEST env_mem_callbacks
00:05:47.059 ************************************
00:05:47.059 10:37:03 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks
00:05:47.059 EAL: Detected CPU lcores: 112
00:05:47.059 EAL: Detected NUMA nodes: 2
00:05:47.059 EAL: Detected static linkage of DPDK
00:05:47.059 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket
00:05:47.059 EAL: Selected IOVA mode 'VA'
00:05:47.059 EAL: No free 2048 kB hugepages reported on node 1
00:05:47.059 EAL: VFIO support initialized
00:05:47.059 TELEMETRY: No legacy callbacks, legacy socket not created
00:05:47.059
00:05:47.059
00:05:47.059 CUnit - A unit testing framework for C - Version 2.1-3
00:05:47.059 http://cunit.sourceforge.net/
00:05:47.059
00:05:47.059
00:05:47.059 Suite: memory
00:05:47.059 Test: test ...
00:05:47.059 register 0x200000200000 2097152
00:05:47.059 malloc 3145728
00:05:47.059 register 0x200000400000 4194304
00:05:47.059 buf 0x200000500000 len 3145728 PASSED
00:05:47.059 malloc 64
00:05:47.059 buf 0x2000004fff40 len 64 PASSED
00:05:47.059 malloc 4194304
00:05:47.059 register 0x200000800000 6291456
00:05:47.059 buf 0x200000a00000 len 4194304 PASSED
00:05:47.059 free 0x200000500000 3145728
00:05:47.059 free 0x2000004fff40 64
00:05:47.059 unregister 0x200000400000 4194304 PASSED
00:05:47.059 free 0x200000a00000 4194304
00:05:47.059 unregister 0x200000800000 6291456 PASSED
00:05:47.059 malloc 8388608
00:05:47.059 register 0x200000400000 10485760
00:05:47.059 buf 0x200000600000 len 8388608 PASSED
00:05:47.059 free 0x200000600000 8388608
00:05:47.059 unregister 0x200000400000 10485760 PASSED
00:05:47.059 passed
00:05:47.059
00:05:47.059 Run Summary: Type Total Ran Passed Failed Inactive
00:05:47.059 suites 1 1 n/a 0 0
00:05:47.059 tests 1 1 1 0 0
00:05:47.059 asserts 15 15 15 0 n/a
00:05:47.059
00:05:47.059 Elapsed time = 0.005 seconds
00:05:47.059
00:05:47.059 real 0m0.066s
00:05:47.059 user 0m0.015s
00:05:47.059 sys 0m0.050s
00:05:47.059 10:37:03 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:47.059 10:37:03 -- common/autotest_common.sh@10 -- # set +x
00:05:47.059 ************************************
00:05:47.059 END TEST env_mem_callbacks
00:05:47.059 ************************************
00:05:47.059
00:05:47.059 real 0m6.350s
00:05:47.059 user 0m4.416s
00:05:47.059 sys 0m1.193s
00:05:47.059 10:37:03 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:47.059 10:37:03 -- common/autotest_common.sh@10 -- # set +x
00:05:47.059 ************************************
00:05:47.059 END TEST env
00:05:47.059 ************************************
00:05:47.059 10:37:03 -- spdk/autotest.sh@176 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh
00:05:47.059 10:37:03 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']'
00:05:47.059 10:37:03 -- common/autotest_common.sh@1083 -- # xtrace_disable
00:05:47.059 10:37:03 -- common/autotest_common.sh@10 -- # set +x
00:05:47.059 ************************************
00:05:47.059 START TEST rpc
00:05:47.059 ************************************
00:05:47.059 10:37:03 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh
00:05:47.059 * Looking for test storage...
00:05:47.059 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc
00:05:47.059 10:37:03 -- rpc/rpc.sh@65 -- # spdk_pid=1967059
00:05:47.059 10:37:03 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT
00:05:47.059 10:37:03 -- rpc/rpc.sh@67 -- # waitforlisten 1967059
00:05:47.059 10:37:03 -- common/autotest_common.sh@819 -- # '[' -z 1967059 ']'
00:05:47.059 10:37:03 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev
00:05:47.059 10:37:03 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock
00:05:47.059 10:37:03 -- common/autotest_common.sh@824 -- # local max_retries=100
00:05:47.059 10:37:03 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
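The env suite that just closed ran three standalone CUnit binaries, and the vtophys expand/shrink ladder above (4, 6, 10, 18, 34, 66, 130, 258, 514, 1026 MB) is consistent with 2^k MB allocations plus roughly 2 MB of heap overhead per step, each grow and shrink fanned out through the registered 'spdk:(nil)' mem event callback. A rough sketch for replaying these units by hand, assuming this workspace's layout (the pci_ut and env_dpdk_post_init paths appear verbatim above; the vtophys path is inferred from env.sh's pattern and may differ):

  spdk=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  sudo $spdk/test/env/vtophys/vtophys          # assumed path: vtophys_malloc_test + vtophys_spdk_malloc_test
  sudo $spdk/test/env/pci/pci_ut               # pci_hook, expects the fake 10000:00:01.0 device to be set up
  sudo $spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000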
00:05:47.059 10:37:03 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:47.059 10:37:03 -- common/autotest_common.sh@10 -- # set +x 00:05:47.059 [2024-07-13 10:37:03.391819] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:47.060 [2024-07-13 10:37:03.391885] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1967059 ] 00:05:47.060 EAL: No free 2048 kB hugepages reported on node 1 00:05:47.318 [2024-07-13 10:37:03.459970] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:47.318 [2024-07-13 10:37:03.499269] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:47.318 [2024-07-13 10:37:03.499374] app.c: 488:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:47.318 [2024-07-13 10:37:03.499384] app.c: 492:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 1967059' to capture a snapshot of events at runtime. 00:05:47.318 [2024-07-13 10:37:03.499393] app.c: 494:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid1967059 for offline analysis/debug. 00:05:47.318 [2024-07-13 10:37:03.499412] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:47.887 10:37:04 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:47.887 10:37:04 -- common/autotest_common.sh@852 -- # return 0 00:05:47.887 10:37:04 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:47.887 10:37:04 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:47.887 10:37:04 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:47.887 10:37:04 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:47.887 10:37:04 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:47.887 10:37:04 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:47.887 10:37:04 -- common/autotest_common.sh@10 -- # set +x 00:05:47.887 ************************************ 00:05:47.887 START TEST rpc_integrity 00:05:47.887 ************************************ 00:05:47.887 10:37:04 -- common/autotest_common.sh@1104 -- # rpc_integrity 00:05:47.887 10:37:04 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:47.887 10:37:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:47.887 10:37:04 -- common/autotest_common.sh@10 -- # set +x 00:05:47.887 10:37:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:47.887 10:37:04 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:47.887 10:37:04 -- rpc/rpc.sh@13 -- # jq length 00:05:47.887 10:37:04 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:47.887 10:37:04 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:47.887 10:37:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:47.887 10:37:04 -- common/autotest_common.sh@10 -- # set +x 00:05:47.887 10:37:04 -- 
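At this point rpc.sh@64 has forked spdk_tgt -e bdev and waitforlisten polls the UNIX socket until the reactor answers; a minimal sketch of that handshake, assuming the default /var/tmp/spdk.sock (rpc_get_methods is a stock SPDK RPC; the polling loop itself is illustrative):

  spdk=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  $spdk/build/bin/spdk_tgt -e bdev &           # same invocation as rpc.sh@64 above
  until $spdk/scripts/rpc.py -s /var/tmp/spdk.sock rpc_get_methods >/dev/null 2>&1; do
          sleep 0.1                            # retry until the socket accepts RPCs
  done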
common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:47.887 10:37:04 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:47.887 10:37:04 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:47.887 10:37:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:47.887 10:37:04 -- common/autotest_common.sh@10 -- # set +x 00:05:48.146 10:37:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.146 10:37:04 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:48.146 { 00:05:48.146 "name": "Malloc0", 00:05:48.146 "aliases": [ 00:05:48.146 "e7684e73-0624-4317-bdae-73d5e63f6cd4" 00:05:48.146 ], 00:05:48.146 "product_name": "Malloc disk", 00:05:48.146 "block_size": 512, 00:05:48.146 "num_blocks": 16384, 00:05:48.146 "uuid": "e7684e73-0624-4317-bdae-73d5e63f6cd4", 00:05:48.146 "assigned_rate_limits": { 00:05:48.146 "rw_ios_per_sec": 0, 00:05:48.146 "rw_mbytes_per_sec": 0, 00:05:48.146 "r_mbytes_per_sec": 0, 00:05:48.146 "w_mbytes_per_sec": 0 00:05:48.146 }, 00:05:48.146 "claimed": false, 00:05:48.146 "zoned": false, 00:05:48.146 "supported_io_types": { 00:05:48.146 "read": true, 00:05:48.146 "write": true, 00:05:48.146 "unmap": true, 00:05:48.146 "write_zeroes": true, 00:05:48.146 "flush": true, 00:05:48.146 "reset": true, 00:05:48.146 "compare": false, 00:05:48.146 "compare_and_write": false, 00:05:48.146 "abort": true, 00:05:48.146 "nvme_admin": false, 00:05:48.146 "nvme_io": false 00:05:48.146 }, 00:05:48.146 "memory_domains": [ 00:05:48.146 { 00:05:48.146 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:48.146 "dma_device_type": 2 00:05:48.146 } 00:05:48.146 ], 00:05:48.146 "driver_specific": {} 00:05:48.146 } 00:05:48.146 ]' 00:05:48.146 10:37:04 -- rpc/rpc.sh@17 -- # jq length 00:05:48.146 10:37:04 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:48.146 10:37:04 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:48.146 10:37:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.146 10:37:04 -- common/autotest_common.sh@10 -- # set +x 00:05:48.146 [2024-07-13 10:37:04.330732] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:48.146 [2024-07-13 10:37:04.330765] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:48.147 [2024-07-13 10:37:04.330779] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x44dd490 00:05:48.147 [2024-07-13 10:37:04.330787] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:48.147 [2024-07-13 10:37:04.331599] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:48.147 [2024-07-13 10:37:04.331622] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:48.147 Passthru0 00:05:48.147 10:37:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.147 10:37:04 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:48.147 10:37:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.147 10:37:04 -- common/autotest_common.sh@10 -- # set +x 00:05:48.147 10:37:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.147 10:37:04 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:48.147 { 00:05:48.147 "name": "Malloc0", 00:05:48.147 "aliases": [ 00:05:48.147 "e7684e73-0624-4317-bdae-73d5e63f6cd4" 00:05:48.147 ], 00:05:48.147 "product_name": "Malloc disk", 00:05:48.147 "block_size": 512, 00:05:48.147 "num_blocks": 16384, 00:05:48.147 "uuid": "e7684e73-0624-4317-bdae-73d5e63f6cd4", 00:05:48.147 "assigned_rate_limits": { 00:05:48.147 "rw_ios_per_sec": 0, 00:05:48.147 
"rw_mbytes_per_sec": 0, 00:05:48.147 "r_mbytes_per_sec": 0, 00:05:48.147 "w_mbytes_per_sec": 0 00:05:48.147 }, 00:05:48.147 "claimed": true, 00:05:48.147 "claim_type": "exclusive_write", 00:05:48.147 "zoned": false, 00:05:48.147 "supported_io_types": { 00:05:48.147 "read": true, 00:05:48.147 "write": true, 00:05:48.147 "unmap": true, 00:05:48.147 "write_zeroes": true, 00:05:48.147 "flush": true, 00:05:48.147 "reset": true, 00:05:48.147 "compare": false, 00:05:48.147 "compare_and_write": false, 00:05:48.147 "abort": true, 00:05:48.147 "nvme_admin": false, 00:05:48.147 "nvme_io": false 00:05:48.147 }, 00:05:48.147 "memory_domains": [ 00:05:48.147 { 00:05:48.147 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:48.147 "dma_device_type": 2 00:05:48.147 } 00:05:48.147 ], 00:05:48.147 "driver_specific": {} 00:05:48.147 }, 00:05:48.147 { 00:05:48.147 "name": "Passthru0", 00:05:48.147 "aliases": [ 00:05:48.147 "1e51a35b-4a89-544c-b051-6999272236c3" 00:05:48.147 ], 00:05:48.147 "product_name": "passthru", 00:05:48.147 "block_size": 512, 00:05:48.147 "num_blocks": 16384, 00:05:48.147 "uuid": "1e51a35b-4a89-544c-b051-6999272236c3", 00:05:48.147 "assigned_rate_limits": { 00:05:48.147 "rw_ios_per_sec": 0, 00:05:48.147 "rw_mbytes_per_sec": 0, 00:05:48.147 "r_mbytes_per_sec": 0, 00:05:48.147 "w_mbytes_per_sec": 0 00:05:48.147 }, 00:05:48.147 "claimed": false, 00:05:48.147 "zoned": false, 00:05:48.147 "supported_io_types": { 00:05:48.147 "read": true, 00:05:48.147 "write": true, 00:05:48.147 "unmap": true, 00:05:48.147 "write_zeroes": true, 00:05:48.147 "flush": true, 00:05:48.147 "reset": true, 00:05:48.147 "compare": false, 00:05:48.147 "compare_and_write": false, 00:05:48.147 "abort": true, 00:05:48.147 "nvme_admin": false, 00:05:48.147 "nvme_io": false 00:05:48.147 }, 00:05:48.147 "memory_domains": [ 00:05:48.147 { 00:05:48.147 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:48.147 "dma_device_type": 2 00:05:48.147 } 00:05:48.147 ], 00:05:48.147 "driver_specific": { 00:05:48.147 "passthru": { 00:05:48.147 "name": "Passthru0", 00:05:48.147 "base_bdev_name": "Malloc0" 00:05:48.147 } 00:05:48.147 } 00:05:48.147 } 00:05:48.147 ]' 00:05:48.147 10:37:04 -- rpc/rpc.sh@21 -- # jq length 00:05:48.147 10:37:04 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:48.147 10:37:04 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:48.147 10:37:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.147 10:37:04 -- common/autotest_common.sh@10 -- # set +x 00:05:48.147 10:37:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.147 10:37:04 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:48.147 10:37:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.147 10:37:04 -- common/autotest_common.sh@10 -- # set +x 00:05:48.147 10:37:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.147 10:37:04 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:48.147 10:37:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.147 10:37:04 -- common/autotest_common.sh@10 -- # set +x 00:05:48.147 10:37:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.147 10:37:04 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:48.147 10:37:04 -- rpc/rpc.sh@26 -- # jq length 00:05:48.147 10:37:04 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:48.147 00:05:48.147 real 0m0.280s 00:05:48.147 user 0m0.168s 00:05:48.147 sys 0m0.043s 00:05:48.147 10:37:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:48.147 10:37:04 -- common/autotest_common.sh@10 -- # set +x 00:05:48.147 
************************************ 00:05:48.147 END TEST rpc_integrity 00:05:48.147 ************************************ 00:05:48.147 10:37:04 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:48.147 10:37:04 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:48.147 10:37:04 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:48.147 10:37:04 -- common/autotest_common.sh@10 -- # set +x 00:05:48.147 ************************************ 00:05:48.147 START TEST rpc_plugins 00:05:48.147 ************************************ 00:05:48.147 10:37:04 -- common/autotest_common.sh@1104 -- # rpc_plugins 00:05:48.147 10:37:04 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:48.147 10:37:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.147 10:37:04 -- common/autotest_common.sh@10 -- # set +x 00:05:48.407 10:37:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.407 10:37:04 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:48.407 10:37:04 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:48.407 10:37:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.407 10:37:04 -- common/autotest_common.sh@10 -- # set +x 00:05:48.407 10:37:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.407 10:37:04 -- rpc/rpc.sh@31 -- # bdevs='[ 00:05:48.407 { 00:05:48.407 "name": "Malloc1", 00:05:48.407 "aliases": [ 00:05:48.407 "5a419f34-db77-41eb-a2b5-88e4c2eb42bc" 00:05:48.407 ], 00:05:48.407 "product_name": "Malloc disk", 00:05:48.407 "block_size": 4096, 00:05:48.407 "num_blocks": 256, 00:05:48.407 "uuid": "5a419f34-db77-41eb-a2b5-88e4c2eb42bc", 00:05:48.407 "assigned_rate_limits": { 00:05:48.407 "rw_ios_per_sec": 0, 00:05:48.407 "rw_mbytes_per_sec": 0, 00:05:48.407 "r_mbytes_per_sec": 0, 00:05:48.407 "w_mbytes_per_sec": 0 00:05:48.407 }, 00:05:48.407 "claimed": false, 00:05:48.407 "zoned": false, 00:05:48.407 "supported_io_types": { 00:05:48.407 "read": true, 00:05:48.407 "write": true, 00:05:48.407 "unmap": true, 00:05:48.407 "write_zeroes": true, 00:05:48.407 "flush": true, 00:05:48.407 "reset": true, 00:05:48.407 "compare": false, 00:05:48.407 "compare_and_write": false, 00:05:48.407 "abort": true, 00:05:48.407 "nvme_admin": false, 00:05:48.407 "nvme_io": false 00:05:48.407 }, 00:05:48.407 "memory_domains": [ 00:05:48.407 { 00:05:48.407 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:48.407 "dma_device_type": 2 00:05:48.407 } 00:05:48.407 ], 00:05:48.407 "driver_specific": {} 00:05:48.407 } 00:05:48.407 ]' 00:05:48.407 10:37:04 -- rpc/rpc.sh@32 -- # jq length 00:05:48.407 10:37:04 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:48.407 10:37:04 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:48.407 10:37:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.407 10:37:04 -- common/autotest_common.sh@10 -- # set +x 00:05:48.407 10:37:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.407 10:37:04 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:48.407 10:37:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.407 10:37:04 -- common/autotest_common.sh@10 -- # set +x 00:05:48.407 10:37:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.407 10:37:04 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:48.407 10:37:04 -- rpc/rpc.sh@36 -- # jq length 00:05:48.407 10:37:04 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:48.407 00:05:48.407 real 0m0.138s 00:05:48.407 user 0m0.075s 00:05:48.407 sys 0m0.026s 00:05:48.407 10:37:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 
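The rpc_integrity and rpc_plugins tests above drive these RPCs through the rpc_cmd wrapper; issued directly against the same socket, the sequence looks roughly like this (method names and arguments are the ones visible in the log, and the jq length checks mirror the tests' own assertions):

  rpc=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
  $rpc bdev_malloc_create 8 512                      # 8 MB malloc bdev, 512-byte blocks; prints e.g. Malloc0
  $rpc bdev_passthru_create -b Malloc0 -p Passthru0  # claims Malloc0, creates Passthru0 on top
  $rpc bdev_get_bdevs | jq length                    # 2 == Malloc0 plus Passthru0
  $rpc bdev_passthru_delete Passthru0
  $rpc bdev_malloc_delete Malloc0
  $rpc bdev_get_bdevs | jq length                    # back to 0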
00:05:48.407 10:37:04 -- common/autotest_common.sh@10 -- # set +x 00:05:48.407 ************************************ 00:05:48.407 END TEST rpc_plugins 00:05:48.407 ************************************ 00:05:48.407 10:37:04 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:48.407 10:37:04 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:48.407 10:37:04 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:48.407 10:37:04 -- common/autotest_common.sh@10 -- # set +x 00:05:48.407 ************************************ 00:05:48.407 START TEST rpc_trace_cmd_test 00:05:48.407 ************************************ 00:05:48.407 10:37:04 -- common/autotest_common.sh@1104 -- # rpc_trace_cmd_test 00:05:48.407 10:37:04 -- rpc/rpc.sh@40 -- # local info 00:05:48.407 10:37:04 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:48.407 10:37:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.407 10:37:04 -- common/autotest_common.sh@10 -- # set +x 00:05:48.407 10:37:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.407 10:37:04 -- rpc/rpc.sh@42 -- # info='{ 00:05:48.407 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid1967059", 00:05:48.407 "tpoint_group_mask": "0x8", 00:05:48.407 "iscsi_conn": { 00:05:48.407 "mask": "0x2", 00:05:48.407 "tpoint_mask": "0x0" 00:05:48.407 }, 00:05:48.407 "scsi": { 00:05:48.407 "mask": "0x4", 00:05:48.407 "tpoint_mask": "0x0" 00:05:48.407 }, 00:05:48.407 "bdev": { 00:05:48.407 "mask": "0x8", 00:05:48.407 "tpoint_mask": "0xffffffffffffffff" 00:05:48.407 }, 00:05:48.407 "nvmf_rdma": { 00:05:48.407 "mask": "0x10", 00:05:48.407 "tpoint_mask": "0x0" 00:05:48.407 }, 00:05:48.407 "nvmf_tcp": { 00:05:48.407 "mask": "0x20", 00:05:48.407 "tpoint_mask": "0x0" 00:05:48.407 }, 00:05:48.407 "ftl": { 00:05:48.407 "mask": "0x40", 00:05:48.407 "tpoint_mask": "0x0" 00:05:48.407 }, 00:05:48.407 "blobfs": { 00:05:48.407 "mask": "0x80", 00:05:48.407 "tpoint_mask": "0x0" 00:05:48.407 }, 00:05:48.407 "dsa": { 00:05:48.407 "mask": "0x200", 00:05:48.407 "tpoint_mask": "0x0" 00:05:48.407 }, 00:05:48.407 "thread": { 00:05:48.407 "mask": "0x400", 00:05:48.407 "tpoint_mask": "0x0" 00:05:48.407 }, 00:05:48.407 "nvme_pcie": { 00:05:48.407 "mask": "0x800", 00:05:48.407 "tpoint_mask": "0x0" 00:05:48.407 }, 00:05:48.407 "iaa": { 00:05:48.407 "mask": "0x1000", 00:05:48.407 "tpoint_mask": "0x0" 00:05:48.407 }, 00:05:48.407 "nvme_tcp": { 00:05:48.407 "mask": "0x2000", 00:05:48.407 "tpoint_mask": "0x0" 00:05:48.407 }, 00:05:48.407 "bdev_nvme": { 00:05:48.407 "mask": "0x4000", 00:05:48.407 "tpoint_mask": "0x0" 00:05:48.407 } 00:05:48.407 }' 00:05:48.407 10:37:04 -- rpc/rpc.sh@43 -- # jq length 00:05:48.407 10:37:04 -- rpc/rpc.sh@43 -- # '[' 15 -gt 2 ']' 00:05:48.407 10:37:04 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:48.667 10:37:04 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:48.667 10:37:04 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:48.667 10:37:04 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:48.667 10:37:04 -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:48.667 10:37:04 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:48.667 10:37:04 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:48.667 10:37:04 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:48.667 00:05:48.667 real 0m0.232s 00:05:48.667 user 0m0.192s 00:05:48.667 sys 0m0.034s 00:05:48.667 10:37:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:48.667 10:37:04 -- common/autotest_common.sh@10 -- # set +x 00:05:48.667 
************************************ 00:05:48.667 END TEST rpc_trace_cmd_test 00:05:48.667 ************************************ 00:05:48.667 10:37:04 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:48.667 10:37:04 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:48.667 10:37:04 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:48.667 10:37:04 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:48.667 10:37:04 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:48.667 10:37:04 -- common/autotest_common.sh@10 -- # set +x 00:05:48.667 ************************************ 00:05:48.667 START TEST rpc_daemon_integrity 00:05:48.667 ************************************ 00:05:48.667 10:37:04 -- common/autotest_common.sh@1104 -- # rpc_integrity 00:05:48.667 10:37:04 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:48.667 10:37:04 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.667 10:37:04 -- common/autotest_common.sh@10 -- # set +x 00:05:48.667 10:37:04 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.667 10:37:05 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:48.667 10:37:05 -- rpc/rpc.sh@13 -- # jq length 00:05:48.667 10:37:05 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:48.667 10:37:05 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:48.667 10:37:05 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.667 10:37:05 -- common/autotest_common.sh@10 -- # set +x 00:05:48.927 10:37:05 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.927 10:37:05 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:48.927 10:37:05 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:48.927 10:37:05 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.927 10:37:05 -- common/autotest_common.sh@10 -- # set +x 00:05:48.927 10:37:05 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.927 10:37:05 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:48.927 { 00:05:48.927 "name": "Malloc2", 00:05:48.927 "aliases": [ 00:05:48.927 "4faac571-5236-4b41-a2f0-075fb1b007b2" 00:05:48.927 ], 00:05:48.927 "product_name": "Malloc disk", 00:05:48.927 "block_size": 512, 00:05:48.927 "num_blocks": 16384, 00:05:48.927 "uuid": "4faac571-5236-4b41-a2f0-075fb1b007b2", 00:05:48.927 "assigned_rate_limits": { 00:05:48.927 "rw_ios_per_sec": 0, 00:05:48.927 "rw_mbytes_per_sec": 0, 00:05:48.927 "r_mbytes_per_sec": 0, 00:05:48.927 "w_mbytes_per_sec": 0 00:05:48.927 }, 00:05:48.927 "claimed": false, 00:05:48.927 "zoned": false, 00:05:48.927 "supported_io_types": { 00:05:48.927 "read": true, 00:05:48.927 "write": true, 00:05:48.927 "unmap": true, 00:05:48.927 "write_zeroes": true, 00:05:48.927 "flush": true, 00:05:48.927 "reset": true, 00:05:48.927 "compare": false, 00:05:48.927 "compare_and_write": false, 00:05:48.927 "abort": true, 00:05:48.927 "nvme_admin": false, 00:05:48.927 "nvme_io": false 00:05:48.927 }, 00:05:48.927 "memory_domains": [ 00:05:48.927 { 00:05:48.927 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:48.927 "dma_device_type": 2 00:05:48.927 } 00:05:48.927 ], 00:05:48.927 "driver_specific": {} 00:05:48.927 } 00:05:48.927 ]' 00:05:48.927 10:37:05 -- rpc/rpc.sh@17 -- # jq length 00:05:48.927 10:37:05 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:48.927 10:37:05 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:48.927 10:37:05 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.927 10:37:05 -- common/autotest_common.sh@10 -- # set +x 00:05:48.927 [2024-07-13 10:37:05.124794] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on 
Malloc2 00:05:48.927 [2024-07-13 10:37:05.124828] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:48.927 [2024-07-13 10:37:05.124843] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x44dced0 00:05:48.927 [2024-07-13 10:37:05.124852] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:48.927 [2024-07-13 10:37:05.125524] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:48.927 [2024-07-13 10:37:05.125546] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:48.927 Passthru0 00:05:48.927 10:37:05 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.927 10:37:05 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:48.927 10:37:05 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.927 10:37:05 -- common/autotest_common.sh@10 -- # set +x 00:05:48.927 10:37:05 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.927 10:37:05 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:48.927 { 00:05:48.927 "name": "Malloc2", 00:05:48.927 "aliases": [ 00:05:48.927 "4faac571-5236-4b41-a2f0-075fb1b007b2" 00:05:48.927 ], 00:05:48.927 "product_name": "Malloc disk", 00:05:48.927 "block_size": 512, 00:05:48.927 "num_blocks": 16384, 00:05:48.927 "uuid": "4faac571-5236-4b41-a2f0-075fb1b007b2", 00:05:48.927 "assigned_rate_limits": { 00:05:48.927 "rw_ios_per_sec": 0, 00:05:48.927 "rw_mbytes_per_sec": 0, 00:05:48.927 "r_mbytes_per_sec": 0, 00:05:48.927 "w_mbytes_per_sec": 0 00:05:48.927 }, 00:05:48.927 "claimed": true, 00:05:48.927 "claim_type": "exclusive_write", 00:05:48.927 "zoned": false, 00:05:48.927 "supported_io_types": { 00:05:48.927 "read": true, 00:05:48.927 "write": true, 00:05:48.927 "unmap": true, 00:05:48.927 "write_zeroes": true, 00:05:48.927 "flush": true, 00:05:48.927 "reset": true, 00:05:48.927 "compare": false, 00:05:48.927 "compare_and_write": false, 00:05:48.927 "abort": true, 00:05:48.927 "nvme_admin": false, 00:05:48.927 "nvme_io": false 00:05:48.927 }, 00:05:48.927 "memory_domains": [ 00:05:48.927 { 00:05:48.927 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:48.927 "dma_device_type": 2 00:05:48.927 } 00:05:48.927 ], 00:05:48.927 "driver_specific": {} 00:05:48.927 }, 00:05:48.927 { 00:05:48.927 "name": "Passthru0", 00:05:48.927 "aliases": [ 00:05:48.927 "29e8089a-c73d-5310-95a9-5759a15476db" 00:05:48.927 ], 00:05:48.927 "product_name": "passthru", 00:05:48.927 "block_size": 512, 00:05:48.927 "num_blocks": 16384, 00:05:48.927 "uuid": "29e8089a-c73d-5310-95a9-5759a15476db", 00:05:48.927 "assigned_rate_limits": { 00:05:48.927 "rw_ios_per_sec": 0, 00:05:48.927 "rw_mbytes_per_sec": 0, 00:05:48.927 "r_mbytes_per_sec": 0, 00:05:48.927 "w_mbytes_per_sec": 0 00:05:48.927 }, 00:05:48.927 "claimed": false, 00:05:48.927 "zoned": false, 00:05:48.927 "supported_io_types": { 00:05:48.927 "read": true, 00:05:48.927 "write": true, 00:05:48.927 "unmap": true, 00:05:48.927 "write_zeroes": true, 00:05:48.927 "flush": true, 00:05:48.927 "reset": true, 00:05:48.927 "compare": false, 00:05:48.927 "compare_and_write": false, 00:05:48.927 "abort": true, 00:05:48.927 "nvme_admin": false, 00:05:48.927 "nvme_io": false 00:05:48.927 }, 00:05:48.927 "memory_domains": [ 00:05:48.927 { 00:05:48.927 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:48.927 "dma_device_type": 2 00:05:48.927 } 00:05:48.927 ], 00:05:48.927 "driver_specific": { 00:05:48.927 "passthru": { 00:05:48.927 "name": "Passthru0", 00:05:48.927 "base_bdev_name": "Malloc2" 00:05:48.927 } 
00:05:48.927 } 00:05:48.927 } 00:05:48.927 ]' 00:05:48.927 10:37:05 -- rpc/rpc.sh@21 -- # jq length 00:05:48.927 10:37:05 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:48.927 10:37:05 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:48.927 10:37:05 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.927 10:37:05 -- common/autotest_common.sh@10 -- # set +x 00:05:48.927 10:37:05 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.927 10:37:05 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:48.927 10:37:05 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.927 10:37:05 -- common/autotest_common.sh@10 -- # set +x 00:05:48.927 10:37:05 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.927 10:37:05 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:48.927 10:37:05 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:48.927 10:37:05 -- common/autotest_common.sh@10 -- # set +x 00:05:48.927 10:37:05 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:48.927 10:37:05 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:48.927 10:37:05 -- rpc/rpc.sh@26 -- # jq length 00:05:48.927 10:37:05 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:48.927 00:05:48.927 real 0m0.277s 00:05:48.927 user 0m0.162s 00:05:48.927 sys 0m0.049s 00:05:48.927 10:37:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:48.927 10:37:05 -- common/autotest_common.sh@10 -- # set +x 00:05:48.927 ************************************ 00:05:48.927 END TEST rpc_daemon_integrity 00:05:48.927 ************************************ 00:05:48.927 10:37:05 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:48.927 10:37:05 -- rpc/rpc.sh@84 -- # killprocess 1967059 00:05:48.927 10:37:05 -- common/autotest_common.sh@926 -- # '[' -z 1967059 ']' 00:05:48.927 10:37:05 -- common/autotest_common.sh@930 -- # kill -0 1967059 00:05:48.927 10:37:05 -- common/autotest_common.sh@931 -- # uname 00:05:48.927 10:37:05 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:49.187 10:37:05 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1967059 00:05:49.187 10:37:05 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:49.187 10:37:05 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:49.187 10:37:05 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1967059' 00:05:49.187 killing process with pid 1967059 00:05:49.187 10:37:05 -- common/autotest_common.sh@945 -- # kill 1967059 00:05:49.187 10:37:05 -- common/autotest_common.sh@950 -- # wait 1967059 00:05:49.446 00:05:49.446 real 0m2.367s 00:05:49.446 user 0m2.959s 00:05:49.446 sys 0m0.722s 00:05:49.446 10:37:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:49.446 10:37:05 -- common/autotest_common.sh@10 -- # set +x 00:05:49.446 ************************************ 00:05:49.446 END TEST rpc 00:05:49.446 ************************************ 00:05:49.446 10:37:05 -- spdk/autotest.sh@177 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:49.446 10:37:05 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:49.446 10:37:05 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:49.446 10:37:05 -- common/autotest_common.sh@10 -- # set +x 00:05:49.446 ************************************ 00:05:49.446 START TEST rpc_client 00:05:49.446 ************************************ 00:05:49.446 10:37:05 -- common/autotest_common.sh@1104 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:49.446 * Looking for test storage... 00:05:49.446 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:05:49.446 10:37:05 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:05:49.446 OK 00:05:49.446 10:37:05 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:05:49.446 00:05:49.446 real 0m0.111s 00:05:49.446 user 0m0.044s 00:05:49.446 sys 0m0.076s 00:05:49.446 10:37:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:49.446 10:37:05 -- common/autotest_common.sh@10 -- # set +x 00:05:49.446 ************************************ 00:05:49.446 END TEST rpc_client 00:05:49.446 ************************************ 00:05:49.707 10:37:05 -- spdk/autotest.sh@178 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:49.707 10:37:05 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:49.707 10:37:05 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:49.707 10:37:05 -- common/autotest_common.sh@10 -- # set +x 00:05:49.707 ************************************ 00:05:49.707 START TEST json_config 00:05:49.707 ************************************ 00:05:49.707 10:37:05 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:49.707 10:37:05 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:49.707 10:37:05 -- nvmf/common.sh@7 -- # uname -s 00:05:49.707 10:37:05 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:49.707 10:37:05 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:49.707 10:37:05 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:49.707 10:37:05 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:49.707 10:37:05 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:49.707 10:37:05 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:49.707 10:37:05 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:49.707 10:37:05 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:49.707 10:37:05 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:49.707 10:37:05 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:49.707 10:37:05 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:49.707 10:37:05 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:49.707 10:37:05 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:49.707 10:37:05 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:49.707 10:37:05 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:49.707 10:37:05 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:49.707 10:37:05 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:49.707 10:37:05 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:49.707 10:37:05 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:49.707 10:37:05 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:49.707 10:37:05 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:49.707 10:37:05 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:49.707 10:37:05 -- paths/export.sh@5 -- # export PATH 00:05:49.707 10:37:05 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:49.707 10:37:05 -- nvmf/common.sh@46 -- # : 0 00:05:49.707 10:37:05 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:49.707 10:37:05 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:49.707 10:37:05 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:49.707 10:37:05 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:49.707 10:37:05 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:49.707 10:37:05 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:49.707 10:37:05 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:49.707 10:37:05 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:49.707 10:37:05 -- json_config/json_config.sh@10 -- # [[ 0 -eq 1 ]] 00:05:49.707 10:37:05 -- json_config/json_config.sh@14 -- # [[ 0 -ne 1 ]] 00:05:49.707 10:37:05 -- json_config/json_config.sh@14 -- # [[ 0 -eq 1 ]] 00:05:49.707 10:37:05 -- json_config/json_config.sh@25 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:49.707 10:37:05 -- json_config/json_config.sh@26 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:05:49.707 WARNING: No tests are enabled so not running JSON configuration tests 00:05:49.707 10:37:05 -- json_config/json_config.sh@27 -- # exit 0 00:05:49.707 00:05:49.707 real 0m0.098s 00:05:49.707 user 0m0.052s 00:05:49.707 sys 0m0.047s 00:05:49.707 10:37:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:49.707 10:37:05 -- common/autotest_common.sh@10 -- # set +x 00:05:49.707 ************************************ 00:05:49.707 END TEST json_config 00:05:49.707 ************************************ 00:05:49.707 10:37:05 -- spdk/autotest.sh@179 -- # run_test json_config_extra_key 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:49.707 10:37:05 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:49.707 10:37:05 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:49.707 10:37:05 -- common/autotest_common.sh@10 -- # set +x 00:05:49.707 ************************************ 00:05:49.707 START TEST json_config_extra_key 00:05:49.707 ************************************ 00:05:49.707 10:37:06 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:49.707 10:37:06 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:49.707 10:37:06 -- nvmf/common.sh@7 -- # uname -s 00:05:49.707 10:37:06 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:49.707 10:37:06 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:49.707 10:37:06 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:49.707 10:37:06 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:49.707 10:37:06 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:49.707 10:37:06 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:49.707 10:37:06 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:49.707 10:37:06 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:49.707 10:37:06 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:49.707 10:37:06 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:49.707 10:37:06 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:49.707 10:37:06 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:49.707 10:37:06 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:49.707 10:37:06 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:49.707 10:37:06 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:49.707 10:37:06 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:49.707 10:37:06 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:49.707 10:37:06 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:49.708 10:37:06 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:49.708 10:37:06 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:49.708 10:37:06 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:49.708 10:37:06 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:49.708 10:37:06 -- paths/export.sh@5 -- # export PATH 00:05:49.708 10:37:06 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:49.708 10:37:06 -- nvmf/common.sh@46 -- # : 0 00:05:49.708 10:37:06 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:49.708 10:37:06 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:49.708 10:37:06 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:49.708 10:37:06 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:49.708 10:37:06 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:49.708 10:37:06 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:49.968 10:37:06 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:49.968 10:37:06 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:49.968 10:37:06 -- json_config/json_config_extra_key.sh@16 -- # app_pid=(['target']='') 00:05:49.968 10:37:06 -- json_config/json_config_extra_key.sh@16 -- # declare -A app_pid 00:05:49.968 10:37:06 -- json_config/json_config_extra_key.sh@17 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:05:49.968 10:37:06 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_socket 00:05:49.968 10:37:06 -- json_config/json_config_extra_key.sh@18 -- # app_params=(['target']='-m 0x1 -s 1024') 00:05:49.968 10:37:06 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_params 00:05:49.968 10:37:06 -- json_config/json_config_extra_key.sh@19 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:05:49.968 10:37:06 -- json_config/json_config_extra_key.sh@19 -- # declare -A configs_path 00:05:49.968 10:37:06 -- json_config/json_config_extra_key.sh@74 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:49.968 10:37:06 -- json_config/json_config_extra_key.sh@76 -- # echo 'INFO: launching applications...' 00:05:49.968 INFO: launching applications... 00:05:49.968 10:37:06 -- json_config/json_config_extra_key.sh@77 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:49.968 10:37:06 -- json_config/json_config_extra_key.sh@24 -- # local app=target 00:05:49.968 10:37:06 -- json_config/json_config_extra_key.sh@25 -- # shift 00:05:49.968 10:37:06 -- json_config/json_config_extra_key.sh@27 -- # [[ -n 22 ]] 00:05:49.968 10:37:06 -- json_config/json_config_extra_key.sh@28 -- # [[ -z '' ]] 00:05:49.968 10:37:06 -- json_config/json_config_extra_key.sh@31 -- # app_pid[$app]=1967716 00:05:49.968 10:37:06 -- json_config/json_config_extra_key.sh@33 -- # echo 'Waiting for target to run...' 00:05:49.968 Waiting for target to run... 
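[annotation] The json_config_extra_key run above drives spdk_tgt through a set of associative arrays keyed by app name. A minimal sketch of that launch pattern, with the flags and socket path copied from the trace; SPDK_DIR stands in for the repository root and is an assumed variable, not something the test defines:

declare -A app_pid app_socket app_params configs_path
app_socket[target]=/var/tmp/spdk_tgt.sock
app_params[target]='-m 0x1 -s 1024'
configs_path[target]=$SPDK_DIR/test/json_config/extra_key.json
# launch the target with its extra-key JSON config and remember the pid
$SPDK_DIR/build/bin/spdk_tgt ${app_params[target]} \
        -r "${app_socket[target]}" --json "${configs_path[target]}" &
app_pid[target]=$!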
00:05:49.968 10:37:06 -- json_config/json_config_extra_key.sh@34 -- # waitforlisten 1967716 /var/tmp/spdk_tgt.sock 00:05:49.968 10:37:06 -- json_config/json_config_extra_key.sh@30 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:49.968 10:37:06 -- common/autotest_common.sh@819 -- # '[' -z 1967716 ']' 00:05:49.968 10:37:06 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:49.968 10:37:06 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:49.968 10:37:06 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:49.968 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:49.968 10:37:06 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:49.968 10:37:06 -- common/autotest_common.sh@10 -- # set +x 00:05:49.968 [2024-07-13 10:37:06.124035] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:49.968 [2024-07-13 10:37:06.124102] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1967716 ] 00:05:49.968 EAL: No free 2048 kB hugepages reported on node 1 00:05:50.226 [2024-07-13 10:37:06.404903] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:50.226 [2024-07-13 10:37:06.424315] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:50.226 [2024-07-13 10:37:06.424406] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:50.794 10:37:06 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:50.794 10:37:06 -- common/autotest_common.sh@852 -- # return 0 00:05:50.794 10:37:06 -- json_config/json_config_extra_key.sh@35 -- # echo '' 00:05:50.794 00:05:50.794 10:37:06 -- json_config/json_config_extra_key.sh@79 -- # echo 'INFO: shutting down applications...' 00:05:50.794 INFO: shutting down applications... 
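[annotation] The waitforlisten call above blocks until the target answers on its UNIX socket; the shutdown loop that follows reuses the same kill -0 probe with a 0.5s sleep. A hedged sketch of the polling side, reusing rpc.py and the socket path from the trace (the retry bound mirrors max_retries=100; SPDK_DIR is an assumed variable):

sock=/var/tmp/spdk_tgt.sock
for _ in $(seq 1 100); do
        # spdk_get_version is the cheapest probe among the RPCs listed later in this log
        "$SPDK_DIR/scripts/rpc.py" -s "$sock" spdk_get_version >/dev/null 2>&1 && break
        sleep 0.5
done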
00:05:50.794 10:37:06 -- json_config/json_config_extra_key.sh@80 -- # json_config_test_shutdown_app target 00:05:50.794 10:37:06 -- json_config/json_config_extra_key.sh@40 -- # local app=target 00:05:50.794 10:37:06 -- json_config/json_config_extra_key.sh@43 -- # [[ -n 22 ]] 00:05:50.794 10:37:06 -- json_config/json_config_extra_key.sh@44 -- # [[ -n 1967716 ]] 00:05:50.794 10:37:06 -- json_config/json_config_extra_key.sh@47 -- # kill -SIGINT 1967716 00:05:50.794 10:37:06 -- json_config/json_config_extra_key.sh@49 -- # (( i = 0 )) 00:05:50.794 10:37:06 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:50.794 10:37:06 -- json_config/json_config_extra_key.sh@50 -- # kill -0 1967716 00:05:50.794 10:37:06 -- json_config/json_config_extra_key.sh@54 -- # sleep 0.5 00:05:51.053 10:37:07 -- json_config/json_config_extra_key.sh@49 -- # (( i++ )) 00:05:51.053 10:37:07 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:51.053 10:37:07 -- json_config/json_config_extra_key.sh@50 -- # kill -0 1967716 00:05:51.053 10:37:07 -- json_config/json_config_extra_key.sh@51 -- # app_pid[$app]= 00:05:51.053 10:37:07 -- json_config/json_config_extra_key.sh@52 -- # break 00:05:51.053 10:37:07 -- json_config/json_config_extra_key.sh@57 -- # [[ -n '' ]] 00:05:51.053 10:37:07 -- json_config/json_config_extra_key.sh@62 -- # echo 'SPDK target shutdown done' 00:05:51.053 SPDK target shutdown done 00:05:51.053 10:37:07 -- json_config/json_config_extra_key.sh@82 -- # echo Success 00:05:51.053 Success 00:05:51.053 00:05:51.053 real 0m1.428s 00:05:51.053 user 0m1.152s 00:05:51.053 sys 0m0.386s 00:05:51.053 10:37:07 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:51.053 10:37:07 -- common/autotest_common.sh@10 -- # set +x 00:05:51.053 ************************************ 00:05:51.053 END TEST json_config_extra_key 00:05:51.053 ************************************ 00:05:51.312 10:37:07 -- spdk/autotest.sh@180 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:51.312 10:37:07 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:51.312 10:37:07 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:51.312 10:37:07 -- common/autotest_common.sh@10 -- # set +x 00:05:51.312 ************************************ 00:05:51.312 START TEST alias_rpc 00:05:51.312 ************************************ 00:05:51.312 10:37:07 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:51.312 * Looking for test storage... 00:05:51.312 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:05:51.312 10:37:07 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:51.312 10:37:07 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:51.312 10:37:07 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=1967991 00:05:51.312 10:37:07 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 1967991 00:05:51.312 10:37:07 -- common/autotest_common.sh@819 -- # '[' -z 1967991 ']' 00:05:51.312 10:37:07 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:51.312 10:37:07 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:51.312 10:37:07 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:05:51.312 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:51.312 10:37:07 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:51.312 10:37:07 -- common/autotest_common.sh@10 -- # set +x 00:05:51.312 [2024-07-13 10:37:07.592161] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:51.312 [2024-07-13 10:37:07.592217] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1967991 ] 00:05:51.312 EAL: No free 2048 kB hugepages reported on node 1 00:05:51.312 [2024-07-13 10:37:07.659025] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:51.313 [2024-07-13 10:37:07.697723] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:51.313 [2024-07-13 10:37:07.697833] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:52.250 10:37:08 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:52.250 10:37:08 -- common/autotest_common.sh@852 -- # return 0 00:05:52.250 10:37:08 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:05:52.250 10:37:08 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 1967991 00:05:52.250 10:37:08 -- common/autotest_common.sh@926 -- # '[' -z 1967991 ']' 00:05:52.250 10:37:08 -- common/autotest_common.sh@930 -- # kill -0 1967991 00:05:52.250 10:37:08 -- common/autotest_common.sh@931 -- # uname 00:05:52.250 10:37:08 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:52.250 10:37:08 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1967991 00:05:52.508 10:37:08 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:52.508 10:37:08 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:52.508 10:37:08 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1967991' 00:05:52.508 killing process with pid 1967991 00:05:52.508 10:37:08 -- common/autotest_common.sh@945 -- # kill 1967991 00:05:52.508 10:37:08 -- common/autotest_common.sh@950 -- # wait 1967991 00:05:52.767 00:05:52.767 real 0m1.466s 00:05:52.767 user 0m1.578s 00:05:52.767 sys 0m0.422s 00:05:52.767 10:37:08 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:52.767 10:37:08 -- common/autotest_common.sh@10 -- # set +x 00:05:52.767 ************************************ 00:05:52.767 END TEST alias_rpc 00:05:52.767 ************************************ 00:05:52.767 10:37:08 -- spdk/autotest.sh@182 -- # [[ 0 -eq 0 ]] 00:05:52.767 10:37:08 -- spdk/autotest.sh@183 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:52.767 10:37:08 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:52.767 10:37:08 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:52.767 10:37:08 -- common/autotest_common.sh@10 -- # set +x 00:05:52.767 ************************************ 00:05:52.767 START TEST spdkcli_tcp 00:05:52.767 ************************************ 00:05:52.767 10:37:08 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:52.767 * Looking for test storage... 
00:05:52.767 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:05:52.767 10:37:09 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:05:52.767 10:37:09 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:05:52.767 10:37:09 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:05:52.767 10:37:09 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:05:52.767 10:37:09 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:05:52.767 10:37:09 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:05:52.767 10:37:09 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:05:52.767 10:37:09 -- common/autotest_common.sh@712 -- # xtrace_disable 00:05:52.767 10:37:09 -- common/autotest_common.sh@10 -- # set +x 00:05:52.767 10:37:09 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=1968311 00:05:52.767 10:37:09 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:05:52.767 10:37:09 -- spdkcli/tcp.sh@27 -- # waitforlisten 1968311 00:05:52.767 10:37:09 -- common/autotest_common.sh@819 -- # '[' -z 1968311 ']' 00:05:52.767 10:37:09 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:52.767 10:37:09 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:52.767 10:37:09 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:52.767 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:52.767 10:37:09 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:52.767 10:37:09 -- common/autotest_common.sh@10 -- # set +x 00:05:52.767 [2024-07-13 10:37:09.125900] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:05:52.767 [2024-07-13 10:37:09.125989] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1968311 ] 00:05:53.026 EAL: No free 2048 kB hugepages reported on node 1 00:05:53.026 [2024-07-13 10:37:09.196537] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:53.026 [2024-07-13 10:37:09.234971] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:53.026 [2024-07-13 10:37:09.235111] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:53.026 [2024-07-13 10:37:09.235116] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:53.595 10:37:09 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:53.595 10:37:09 -- common/autotest_common.sh@852 -- # return 0 00:05:53.595 10:37:09 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:05:53.595 10:37:09 -- spdkcli/tcp.sh@31 -- # socat_pid=1968489 00:05:53.595 10:37:09 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:05:53.855 [ 00:05:53.855 "spdk_get_version", 00:05:53.855 "rpc_get_methods", 00:05:53.855 "trace_get_info", 00:05:53.855 "trace_get_tpoint_group_mask", 00:05:53.855 "trace_disable_tpoint_group", 00:05:53.855 "trace_enable_tpoint_group", 00:05:53.855 "trace_clear_tpoint_mask", 00:05:53.855 "trace_set_tpoint_mask", 00:05:53.855 "vfu_tgt_set_base_path", 00:05:53.855 "framework_get_pci_devices", 00:05:53.855 "framework_get_config", 00:05:53.855 "framework_get_subsystems", 00:05:53.855 "iobuf_get_stats", 00:05:53.855 "iobuf_set_options", 00:05:53.855 "sock_set_default_impl", 00:05:53.855 "sock_impl_set_options", 00:05:53.855 "sock_impl_get_options", 00:05:53.855 "vmd_rescan", 00:05:53.855 "vmd_remove_device", 00:05:53.855 "vmd_enable", 00:05:53.855 "accel_get_stats", 00:05:53.855 "accel_set_options", 00:05:53.855 "accel_set_driver", 00:05:53.855 "accel_crypto_key_destroy", 00:05:53.855 "accel_crypto_keys_get", 00:05:53.855 "accel_crypto_key_create", 00:05:53.855 "accel_assign_opc", 00:05:53.855 "accel_get_module_info", 00:05:53.855 "accel_get_opc_assignments", 00:05:53.855 "notify_get_notifications", 00:05:53.855 "notify_get_types", 00:05:53.855 "bdev_get_histogram", 00:05:53.855 "bdev_enable_histogram", 00:05:53.855 "bdev_set_qos_limit", 00:05:53.855 "bdev_set_qd_sampling_period", 00:05:53.855 "bdev_get_bdevs", 00:05:53.855 "bdev_reset_iostat", 00:05:53.855 "bdev_get_iostat", 00:05:53.855 "bdev_examine", 00:05:53.855 "bdev_wait_for_examine", 00:05:53.855 "bdev_set_options", 00:05:53.855 "scsi_get_devices", 00:05:53.855 "thread_set_cpumask", 00:05:53.855 "framework_get_scheduler", 00:05:53.855 "framework_set_scheduler", 00:05:53.855 "framework_get_reactors", 00:05:53.855 "thread_get_io_channels", 00:05:53.855 "thread_get_pollers", 00:05:53.855 "thread_get_stats", 00:05:53.855 "framework_monitor_context_switch", 00:05:53.855 "spdk_kill_instance", 00:05:53.855 "log_enable_timestamps", 00:05:53.855 "log_get_flags", 00:05:53.855 "log_clear_flag", 00:05:53.855 "log_set_flag", 00:05:53.855 "log_get_level", 00:05:53.855 "log_set_level", 00:05:53.855 "log_get_print_level", 00:05:53.855 "log_set_print_level", 00:05:53.855 "framework_enable_cpumask_locks", 00:05:53.855 "framework_disable_cpumask_locks", 00:05:53.855 "framework_wait_init", 00:05:53.855 
"framework_start_init", 00:05:53.855 "virtio_blk_create_transport", 00:05:53.855 "virtio_blk_get_transports", 00:05:53.855 "vhost_controller_set_coalescing", 00:05:53.855 "vhost_get_controllers", 00:05:53.855 "vhost_delete_controller", 00:05:53.855 "vhost_create_blk_controller", 00:05:53.855 "vhost_scsi_controller_remove_target", 00:05:53.855 "vhost_scsi_controller_add_target", 00:05:53.855 "vhost_start_scsi_controller", 00:05:53.855 "vhost_create_scsi_controller", 00:05:53.855 "ublk_recover_disk", 00:05:53.855 "ublk_get_disks", 00:05:53.855 "ublk_stop_disk", 00:05:53.855 "ublk_start_disk", 00:05:53.855 "ublk_destroy_target", 00:05:53.855 "ublk_create_target", 00:05:53.855 "nbd_get_disks", 00:05:53.855 "nbd_stop_disk", 00:05:53.855 "nbd_start_disk", 00:05:53.855 "env_dpdk_get_mem_stats", 00:05:53.855 "nvmf_subsystem_get_listeners", 00:05:53.855 "nvmf_subsystem_get_qpairs", 00:05:53.855 "nvmf_subsystem_get_controllers", 00:05:53.855 "nvmf_get_stats", 00:05:53.855 "nvmf_get_transports", 00:05:53.855 "nvmf_create_transport", 00:05:53.855 "nvmf_get_targets", 00:05:53.855 "nvmf_delete_target", 00:05:53.855 "nvmf_create_target", 00:05:53.855 "nvmf_subsystem_allow_any_host", 00:05:53.855 "nvmf_subsystem_remove_host", 00:05:53.855 "nvmf_subsystem_add_host", 00:05:53.855 "nvmf_subsystem_remove_ns", 00:05:53.855 "nvmf_subsystem_add_ns", 00:05:53.855 "nvmf_subsystem_listener_set_ana_state", 00:05:53.855 "nvmf_discovery_get_referrals", 00:05:53.855 "nvmf_discovery_remove_referral", 00:05:53.855 "nvmf_discovery_add_referral", 00:05:53.855 "nvmf_subsystem_remove_listener", 00:05:53.855 "nvmf_subsystem_add_listener", 00:05:53.855 "nvmf_delete_subsystem", 00:05:53.855 "nvmf_create_subsystem", 00:05:53.855 "nvmf_get_subsystems", 00:05:53.855 "nvmf_set_crdt", 00:05:53.855 "nvmf_set_config", 00:05:53.855 "nvmf_set_max_subsystems", 00:05:53.855 "iscsi_set_options", 00:05:53.855 "iscsi_get_auth_groups", 00:05:53.855 "iscsi_auth_group_remove_secret", 00:05:53.855 "iscsi_auth_group_add_secret", 00:05:53.855 "iscsi_delete_auth_group", 00:05:53.855 "iscsi_create_auth_group", 00:05:53.855 "iscsi_set_discovery_auth", 00:05:53.855 "iscsi_get_options", 00:05:53.855 "iscsi_target_node_request_logout", 00:05:53.855 "iscsi_target_node_set_redirect", 00:05:53.855 "iscsi_target_node_set_auth", 00:05:53.855 "iscsi_target_node_add_lun", 00:05:53.855 "iscsi_get_connections", 00:05:53.855 "iscsi_portal_group_set_auth", 00:05:53.855 "iscsi_start_portal_group", 00:05:53.855 "iscsi_delete_portal_group", 00:05:53.855 "iscsi_create_portal_group", 00:05:53.855 "iscsi_get_portal_groups", 00:05:53.855 "iscsi_delete_target_node", 00:05:53.855 "iscsi_target_node_remove_pg_ig_maps", 00:05:53.855 "iscsi_target_node_add_pg_ig_maps", 00:05:53.855 "iscsi_create_target_node", 00:05:53.855 "iscsi_get_target_nodes", 00:05:53.855 "iscsi_delete_initiator_group", 00:05:53.855 "iscsi_initiator_group_remove_initiators", 00:05:53.855 "iscsi_initiator_group_add_initiators", 00:05:53.855 "iscsi_create_initiator_group", 00:05:53.855 "iscsi_get_initiator_groups", 00:05:53.855 "vfu_virtio_create_scsi_endpoint", 00:05:53.855 "vfu_virtio_scsi_remove_target", 00:05:53.855 "vfu_virtio_scsi_add_target", 00:05:53.855 "vfu_virtio_create_blk_endpoint", 00:05:53.855 "vfu_virtio_delete_endpoint", 00:05:53.855 "iaa_scan_accel_module", 00:05:53.855 "dsa_scan_accel_module", 00:05:53.855 "ioat_scan_accel_module", 00:05:53.855 "accel_error_inject_error", 00:05:53.855 "bdev_iscsi_delete", 00:05:53.855 "bdev_iscsi_create", 00:05:53.855 "bdev_iscsi_set_options", 
00:05:53.855 "bdev_virtio_attach_controller", 00:05:53.855 "bdev_virtio_scsi_get_devices", 00:05:53.855 "bdev_virtio_detach_controller", 00:05:53.855 "bdev_virtio_blk_set_hotplug", 00:05:53.855 "bdev_ftl_set_property", 00:05:53.855 "bdev_ftl_get_properties", 00:05:53.856 "bdev_ftl_get_stats", 00:05:53.856 "bdev_ftl_unmap", 00:05:53.856 "bdev_ftl_unload", 00:05:53.856 "bdev_ftl_delete", 00:05:53.856 "bdev_ftl_load", 00:05:53.856 "bdev_ftl_create", 00:05:53.856 "bdev_aio_delete", 00:05:53.856 "bdev_aio_rescan", 00:05:53.856 "bdev_aio_create", 00:05:53.856 "blobfs_create", 00:05:53.856 "blobfs_detect", 00:05:53.856 "blobfs_set_cache_size", 00:05:53.856 "bdev_zone_block_delete", 00:05:53.856 "bdev_zone_block_create", 00:05:53.856 "bdev_delay_delete", 00:05:53.856 "bdev_delay_create", 00:05:53.856 "bdev_delay_update_latency", 00:05:53.856 "bdev_split_delete", 00:05:53.856 "bdev_split_create", 00:05:53.856 "bdev_error_inject_error", 00:05:53.856 "bdev_error_delete", 00:05:53.856 "bdev_error_create", 00:05:53.856 "bdev_raid_set_options", 00:05:53.856 "bdev_raid_remove_base_bdev", 00:05:53.856 "bdev_raid_add_base_bdev", 00:05:53.856 "bdev_raid_delete", 00:05:53.856 "bdev_raid_create", 00:05:53.856 "bdev_raid_get_bdevs", 00:05:53.856 "bdev_lvol_grow_lvstore", 00:05:53.856 "bdev_lvol_get_lvols", 00:05:53.856 "bdev_lvol_get_lvstores", 00:05:53.856 "bdev_lvol_delete", 00:05:53.856 "bdev_lvol_set_read_only", 00:05:53.856 "bdev_lvol_resize", 00:05:53.856 "bdev_lvol_decouple_parent", 00:05:53.856 "bdev_lvol_inflate", 00:05:53.856 "bdev_lvol_rename", 00:05:53.856 "bdev_lvol_clone_bdev", 00:05:53.856 "bdev_lvol_clone", 00:05:53.856 "bdev_lvol_snapshot", 00:05:53.856 "bdev_lvol_create", 00:05:53.856 "bdev_lvol_delete_lvstore", 00:05:53.856 "bdev_lvol_rename_lvstore", 00:05:53.856 "bdev_lvol_create_lvstore", 00:05:53.856 "bdev_passthru_delete", 00:05:53.856 "bdev_passthru_create", 00:05:53.856 "bdev_nvme_cuse_unregister", 00:05:53.856 "bdev_nvme_cuse_register", 00:05:53.856 "bdev_opal_new_user", 00:05:53.856 "bdev_opal_set_lock_state", 00:05:53.856 "bdev_opal_delete", 00:05:53.856 "bdev_opal_get_info", 00:05:53.856 "bdev_opal_create", 00:05:53.856 "bdev_nvme_opal_revert", 00:05:53.856 "bdev_nvme_opal_init", 00:05:53.856 "bdev_nvme_send_cmd", 00:05:53.856 "bdev_nvme_get_path_iostat", 00:05:53.856 "bdev_nvme_get_mdns_discovery_info", 00:05:53.856 "bdev_nvme_stop_mdns_discovery", 00:05:53.856 "bdev_nvme_start_mdns_discovery", 00:05:53.856 "bdev_nvme_set_multipath_policy", 00:05:53.856 "bdev_nvme_set_preferred_path", 00:05:53.856 "bdev_nvme_get_io_paths", 00:05:53.856 "bdev_nvme_remove_error_injection", 00:05:53.856 "bdev_nvme_add_error_injection", 00:05:53.856 "bdev_nvme_get_discovery_info", 00:05:53.856 "bdev_nvme_stop_discovery", 00:05:53.856 "bdev_nvme_start_discovery", 00:05:53.856 "bdev_nvme_get_controller_health_info", 00:05:53.856 "bdev_nvme_disable_controller", 00:05:53.856 "bdev_nvme_enable_controller", 00:05:53.856 "bdev_nvme_reset_controller", 00:05:53.856 "bdev_nvme_get_transport_statistics", 00:05:53.856 "bdev_nvme_apply_firmware", 00:05:53.856 "bdev_nvme_detach_controller", 00:05:53.856 "bdev_nvme_get_controllers", 00:05:53.856 "bdev_nvme_attach_controller", 00:05:53.856 "bdev_nvme_set_hotplug", 00:05:53.856 "bdev_nvme_set_options", 00:05:53.856 "bdev_null_resize", 00:05:53.856 "bdev_null_delete", 00:05:53.856 "bdev_null_create", 00:05:53.856 "bdev_malloc_delete", 00:05:53.856 "bdev_malloc_create" 00:05:53.856 ] 00:05:53.856 10:37:10 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 
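[annotation] The method list above was fetched over TCP rather than the UNIX socket: socat bridges port 9998 to /var/tmp/spdk.sock, exactly as the trace shows. A sketch of that bridge with the rpc.py flags copied verbatim (-r retries, -t timeout, -s address, -p port); only the final grep is added for illustration, and SPDK_DIR is an assumed variable:

socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock &
socat_pid=$!
"$SPDK_DIR/scripts/rpc.py" -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods \
        | grep -c '"bdev_'    # illustrative: count the bdev_* methods returned
kill "$socat_pid"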
00:05:53.856 10:37:10 -- common/autotest_common.sh@718 -- # xtrace_disable 00:05:53.856 10:37:10 -- common/autotest_common.sh@10 -- # set +x 00:05:53.856 10:37:10 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:05:53.856 10:37:10 -- spdkcli/tcp.sh@38 -- # killprocess 1968311 00:05:53.856 10:37:10 -- common/autotest_common.sh@926 -- # '[' -z 1968311 ']' 00:05:53.856 10:37:10 -- common/autotest_common.sh@930 -- # kill -0 1968311 00:05:53.856 10:37:10 -- common/autotest_common.sh@931 -- # uname 00:05:53.856 10:37:10 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:53.856 10:37:10 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1968311 00:05:53.856 10:37:10 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:53.856 10:37:10 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:53.856 10:37:10 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1968311' 00:05:53.856 killing process with pid 1968311 00:05:53.856 10:37:10 -- common/autotest_common.sh@945 -- # kill 1968311 00:05:53.856 10:37:10 -- common/autotest_common.sh@950 -- # wait 1968311 00:05:54.115 00:05:54.115 real 0m1.501s 00:05:54.115 user 0m2.769s 00:05:54.115 sys 0m0.508s 00:05:54.115 10:37:10 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:54.115 10:37:10 -- common/autotest_common.sh@10 -- # set +x 00:05:54.115 ************************************ 00:05:54.115 END TEST spdkcli_tcp 00:05:54.115 ************************************ 00:05:54.374 10:37:10 -- spdk/autotest.sh@186 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:54.374 10:37:10 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:54.374 10:37:10 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:54.374 10:37:10 -- common/autotest_common.sh@10 -- # set +x 00:05:54.374 ************************************ 00:05:54.374 START TEST dpdk_mem_utility 00:05:54.374 ************************************ 00:05:54.374 10:37:10 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:54.374 * Looking for test storage... 00:05:54.374 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:05:54.375 10:37:10 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:54.375 10:37:10 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=1968639 00:05:54.375 10:37:10 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 1968639 00:05:54.375 10:37:10 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:54.375 10:37:10 -- common/autotest_common.sh@819 -- # '[' -z 1968639 ']' 00:05:54.375 10:37:10 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:54.375 10:37:10 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:54.375 10:37:10 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:54.375 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
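[annotation] The killprocess sequences recurring in this log all follow one shape: confirm the pid is alive, check its comm name against "sudo", SIGINT it, then wait. A reconstruction assembled from the traced commands; the function wrapper itself is an assumption, not SPDK's exact helper:

killprocess() {
        local pid=$1
        kill -0 "$pid" || return 1                  # nothing to do if it is already gone
        if [ "$(uname)" = Linux ]; then
                local name
                name=$(ps --no-headers -o comm= "$pid")
                # the trace compares the comm name against "sudo" before signalling
                [ "$name" = sudo ] && return 0
        fi
        echo "killing process with pid $pid"
        kill "$pid" && wait "$pid"
}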
00:05:54.375 10:37:10 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:54.375 10:37:10 -- common/autotest_common.sh@10 -- # set +x 00:05:54.375 [2024-07-13 10:37:10.669484] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:54.375 [2024-07-13 10:37:10.669558] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1968639 ] 00:05:54.375 EAL: No free 2048 kB hugepages reported on node 1 00:05:54.375 [2024-07-13 10:37:10.738291] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:54.637 [2024-07-13 10:37:10.775889] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:54.637 [2024-07-13 10:37:10.775996] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:55.300 10:37:11 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:55.300 10:37:11 -- common/autotest_common.sh@852 -- # return 0 00:05:55.300 10:37:11 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:05:55.300 10:37:11 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:05:55.300 10:37:11 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:55.300 10:37:11 -- common/autotest_common.sh@10 -- # set +x 00:05:55.300 { 00:05:55.300 "filename": "/tmp/spdk_mem_dump.txt" 00:05:55.300 } 00:05:55.300 10:37:11 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:55.300 10:37:11 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:55.300 DPDK memory size 814.000000 MiB in 1 heap(s) 00:05:55.300 1 heaps totaling size 814.000000 MiB 00:05:55.300 size: 814.000000 MiB heap id: 0 00:05:55.300 end heaps---------- 00:05:55.300 8 mempools totaling size 598.116089 MiB 00:05:55.300 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:05:55.300 size: 158.602051 MiB name: PDU_data_out_Pool 00:05:55.300 size: 84.521057 MiB name: bdev_io_1968639 00:05:55.300 size: 51.011292 MiB name: evtpool_1968639 00:05:55.300 size: 50.003479 MiB name: msgpool_1968639 00:05:55.300 size: 21.763794 MiB name: PDU_Pool 00:05:55.300 size: 19.513306 MiB name: SCSI_TASK_Pool 00:05:55.300 size: 0.026123 MiB name: Session_Pool 00:05:55.300 end mempools------- 00:05:55.300 6 memzones totaling size 4.142822 MiB 00:05:55.300 size: 1.000366 MiB name: RG_ring_0_1968639 00:05:55.300 size: 1.000366 MiB name: RG_ring_1_1968639 00:05:55.300 size: 1.000366 MiB name: RG_ring_4_1968639 00:05:55.300 size: 1.000366 MiB name: RG_ring_5_1968639 00:05:55.300 size: 0.125366 MiB name: RG_ring_2_1968639 00:05:55.300 size: 0.015991 MiB name: RG_ring_3_1968639 00:05:55.300 end memzones------- 00:05:55.300 10:37:11 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:05:55.300 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:05:55.300 list of free elements. 
size: 12.519348 MiB 00:05:55.300 element at address: 0x200000400000 with size: 1.999512 MiB 00:05:55.300 element at address: 0x200018e00000 with size: 0.999878 MiB 00:05:55.300 element at address: 0x200019000000 with size: 0.999878 MiB 00:05:55.300 element at address: 0x200003e00000 with size: 0.996277 MiB 00:05:55.300 element at address: 0x200031c00000 with size: 0.994446 MiB 00:05:55.300 element at address: 0x200013800000 with size: 0.978699 MiB 00:05:55.300 element at address: 0x200007000000 with size: 0.959839 MiB 00:05:55.300 element at address: 0x200019200000 with size: 0.936584 MiB 00:05:55.300 element at address: 0x200000200000 with size: 0.841614 MiB 00:05:55.300 element at address: 0x20001aa00000 with size: 0.582886 MiB 00:05:55.300 element at address: 0x20000b200000 with size: 0.490723 MiB 00:05:55.300 element at address: 0x200000800000 with size: 0.487793 MiB 00:05:55.300 element at address: 0x200019400000 with size: 0.485657 MiB 00:05:55.300 element at address: 0x200027e00000 with size: 0.410034 MiB 00:05:55.300 element at address: 0x200003a00000 with size: 0.355530 MiB 00:05:55.300 list of standard malloc elements. size: 199.218079 MiB 00:05:55.300 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:05:55.300 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:05:55.300 element at address: 0x200018efff80 with size: 1.000122 MiB 00:05:55.300 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:05:55.300 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:05:55.300 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:05:55.300 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:05:55.300 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:05:55.300 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:05:55.300 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:05:55.300 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:05:55.300 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:05:55.300 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:05:55.300 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:05:55.300 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:05:55.300 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:05:55.300 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:05:55.300 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:05:55.300 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:05:55.300 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:05:55.300 element at address: 0x200003adb300 with size: 0.000183 MiB 00:05:55.300 element at address: 0x200003adb500 with size: 0.000183 MiB 00:05:55.300 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:05:55.300 element at address: 0x200003affa80 with size: 0.000183 MiB 00:05:55.300 element at address: 0x200003affb40 with size: 0.000183 MiB 00:05:55.300 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:05:55.300 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:05:55.300 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:05:55.300 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:05:55.300 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:05:55.300 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:05:55.300 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:05:55.300 element at address: 0x2000192efd00 with size: 0.000183 MiB 
00:05:55.300 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:05:55.300 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:05:55.300 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:05:55.300 element at address: 0x200027e68f80 with size: 0.000183 MiB 00:05:55.300 element at address: 0x200027e69040 with size: 0.000183 MiB 00:05:55.300 element at address: 0x200027e6fc40 with size: 0.000183 MiB 00:05:55.300 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:05:55.300 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:05:55.300 list of memzone associated elements. size: 602.262573 MiB 00:05:55.300 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:05:55.300 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:05:55.300 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:05:55.300 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:05:55.300 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:05:55.300 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_1968639_0 00:05:55.300 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:05:55.300 associated memzone info: size: 48.002930 MiB name: MP_evtpool_1968639_0 00:05:55.300 element at address: 0x200003fff380 with size: 48.003052 MiB 00:05:55.300 associated memzone info: size: 48.002930 MiB name: MP_msgpool_1968639_0 00:05:55.300 element at address: 0x2000195be940 with size: 20.255554 MiB 00:05:55.300 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:05:55.300 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:05:55.300 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:05:55.300 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:05:55.300 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_1968639 00:05:55.300 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:05:55.300 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_1968639 00:05:55.300 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:05:55.300 associated memzone info: size: 1.007996 MiB name: MP_evtpool_1968639 00:05:55.300 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:05:55.300 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:05:55.300 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:05:55.300 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:05:55.300 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:05:55.300 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:05:55.300 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:05:55.300 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:05:55.300 element at address: 0x200003eff180 with size: 1.000488 MiB 00:05:55.300 associated memzone info: size: 1.000366 MiB name: RG_ring_0_1968639 00:05:55.300 element at address: 0x200003affc00 with size: 1.000488 MiB 00:05:55.300 associated memzone info: size: 1.000366 MiB name: RG_ring_1_1968639 00:05:55.300 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:05:55.300 associated memzone info: size: 1.000366 MiB name: RG_ring_4_1968639 00:05:55.300 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:05:55.300 associated memzone info: size: 1.000366 MiB name: RG_ring_5_1968639 00:05:55.300 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:05:55.300 associated 
memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_1968639 00:05:55.300 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:05:55.300 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:05:55.300 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:05:55.300 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:05:55.300 element at address: 0x20001947c540 with size: 0.250488 MiB 00:05:55.300 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:05:55.301 element at address: 0x200003adf880 with size: 0.125488 MiB 00:05:55.301 associated memzone info: size: 0.125366 MiB name: RG_ring_2_1968639 00:05:55.301 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:05:55.301 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:05:55.301 element at address: 0x200027e69100 with size: 0.023743 MiB 00:05:55.301 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:05:55.301 element at address: 0x200003adb5c0 with size: 0.016113 MiB 00:05:55.301 associated memzone info: size: 0.015991 MiB name: RG_ring_3_1968639 00:05:55.301 element at address: 0x200027e6f240 with size: 0.002441 MiB 00:05:55.301 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:05:55.301 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:05:55.301 associated memzone info: size: 0.000183 MiB name: MP_msgpool_1968639 00:05:55.301 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:05:55.301 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_1968639 00:05:55.301 element at address: 0x200027e6fd00 with size: 0.000305 MiB 00:05:55.301 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:05:55.301 10:37:11 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:05:55.301 10:37:11 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 1968639 00:05:55.301 10:37:11 -- common/autotest_common.sh@926 -- # '[' -z 1968639 ']' 00:05:55.301 10:37:11 -- common/autotest_common.sh@930 -- # kill -0 1968639 00:05:55.301 10:37:11 -- common/autotest_common.sh@931 -- # uname 00:05:55.301 10:37:11 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:05:55.301 10:37:11 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1968639 00:05:55.301 10:37:11 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:05:55.301 10:37:11 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:05:55.301 10:37:11 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1968639' 00:05:55.301 killing process with pid 1968639 00:05:55.301 10:37:11 -- common/autotest_common.sh@945 -- # kill 1968639 00:05:55.301 10:37:11 -- common/autotest_common.sh@950 -- # wait 1968639 00:05:55.559 00:05:55.559 real 0m1.370s 00:05:55.559 user 0m1.391s 00:05:55.559 sys 0m0.430s 00:05:55.559 10:37:11 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:55.559 10:37:11 -- common/autotest_common.sh@10 -- # set +x 00:05:55.559 ************************************ 00:05:55.559 END TEST dpdk_mem_utility 00:05:55.559 ************************************ 00:05:55.817 10:37:11 -- spdk/autotest.sh@187 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:55.817 10:37:11 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:55.817 10:37:11 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:55.817 10:37:11 -- common/autotest_common.sh@10 -- # set +x 
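[annotation] The heap/mempool/memzone dump above comes from a two-step flow: the env_dpdk_get_mem_stats RPC makes the running target write its DPDK memory stats to /tmp/spdk_mem_dump.txt (the filename returned in the RPC response above), and dpdk_mem_info.py renders that dump. A sketch using the script paths and the -m 0 heap selector from the trace; SPDK_DIR is an assumed variable:

"$SPDK_DIR/scripts/rpc.py" env_dpdk_get_mem_stats   # target writes /tmp/spdk_mem_dump.txt
"$SPDK_DIR/scripts/dpdk_mem_info.py"                # summary: heaps, mempools, memzones
"$SPDK_DIR/scripts/dpdk_mem_info.py" -m 0           # per-element detail for heap id 0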
00:05:55.817 ************************************ 00:05:55.817 START TEST event 00:05:55.817 ************************************ 00:05:55.817 10:37:11 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:55.817 * Looking for test storage... 00:05:55.817 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:05:55.817 10:37:12 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:05:55.817 10:37:12 -- bdev/nbd_common.sh@6 -- # set -e 00:05:55.817 10:37:12 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:55.817 10:37:12 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:05:55.817 10:37:12 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:55.817 10:37:12 -- common/autotest_common.sh@10 -- # set +x 00:05:55.817 ************************************ 00:05:55.817 START TEST event_perf 00:05:55.817 ************************************ 00:05:55.817 10:37:12 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:55.817 Running I/O for 1 seconds...[2024-07-13 10:37:12.068865] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:55.817 [2024-07-13 10:37:12.068959] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1968907 ] 00:05:55.817 EAL: No free 2048 kB hugepages reported on node 1 00:05:55.817 [2024-07-13 10:37:12.140962] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:55.817 [2024-07-13 10:37:12.180140] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:55.817 [2024-07-13 10:37:12.180236] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:55.817 [2024-07-13 10:37:12.180297] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:55.817 [2024-07-13 10:37:12.180299] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:57.193 Running I/O for 1 seconds... 00:05:57.193 lcore 0: 196720 00:05:57.193 lcore 1: 196717 00:05:57.193 lcore 2: 196719 00:05:57.193 lcore 3: 196719 00:05:57.193 done. 
00:05:57.193 00:05:57.193 real 0m1.182s 00:05:57.193 user 0m4.087s 00:05:57.193 sys 0m0.093s 00:05:57.193 10:37:13 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:57.193 10:37:13 -- common/autotest_common.sh@10 -- # set +x 00:05:57.193 ************************************ 00:05:57.193 END TEST event_perf 00:05:57.193 ************************************ 00:05:57.193 10:37:13 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:57.193 10:37:13 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:05:57.193 10:37:13 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:57.193 10:37:13 -- common/autotest_common.sh@10 -- # set +x 00:05:57.193 ************************************ 00:05:57.193 START TEST event_reactor 00:05:57.193 ************************************ 00:05:57.193 10:37:13 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:57.193 [2024-07-13 10:37:13.291814] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:05:57.193 [2024-07-13 10:37:13.291904] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1969175 ] 00:05:57.193 EAL: No free 2048 kB hugepages reported on node 1 00:05:57.193 [2024-07-13 10:37:13.361011] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:57.193 [2024-07-13 10:37:13.395728] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:58.128 test_start 00:05:58.128 oneshot 00:05:58.128 tick 100 00:05:58.128 tick 100 00:05:58.128 tick 250 00:05:58.128 tick 100 00:05:58.128 tick 100 00:05:58.128 tick 100 00:05:58.128 tick 250 00:05:58.128 tick 500 00:05:58.128 tick 100 00:05:58.128 tick 100 00:05:58.128 tick 250 00:05:58.128 tick 100 00:05:58.128 tick 100 00:05:58.128 test_end 00:05:58.128 00:05:58.128 real 0m1.177s 00:05:58.128 user 0m1.089s 00:05:58.128 sys 0m0.084s 00:05:58.128 10:37:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:58.128 10:37:14 -- common/autotest_common.sh@10 -- # set +x 00:05:58.128 ************************************ 00:05:58.128 END TEST event_reactor 00:05:58.128 ************************************ 00:05:58.128 10:37:14 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:58.128 10:37:14 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:05:58.128 10:37:14 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:58.128 10:37:14 -- common/autotest_common.sh@10 -- # set +x 00:05:58.128 ************************************ 00:05:58.128 START TEST event_reactor_perf 00:05:58.128 ************************************ 00:05:58.128 10:37:14 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:58.128 [2024-07-13 10:37:14.513536] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:05:58.128 [2024-07-13 10:37:14.513625] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1969459 ] 00:05:58.387 EAL: No free 2048 kB hugepages reported on node 1 00:05:58.387 [2024-07-13 10:37:14.584120] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:58.387 [2024-07-13 10:37:14.618726] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:59.323 test_start 00:05:59.323 test_end 00:05:59.323 Performance: 903905 events per second 00:05:59.323 00:05:59.323 real 0m1.175s 00:05:59.323 user 0m1.084s 00:05:59.323 sys 0m0.087s 00:05:59.323 10:37:15 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:59.323 10:37:15 -- common/autotest_common.sh@10 -- # set +x 00:05:59.323 ************************************ 00:05:59.323 END TEST event_reactor_perf 00:05:59.323 ************************************ 00:05:59.323 10:37:15 -- event/event.sh@49 -- # uname -s 00:05:59.582 10:37:15 -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:05:59.582 10:37:15 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:59.582 10:37:15 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:59.582 10:37:15 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:59.582 10:37:15 -- common/autotest_common.sh@10 -- # set +x 00:05:59.582 ************************************ 00:05:59.582 START TEST event_scheduler 00:05:59.582 ************************************ 00:05:59.582 10:37:15 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:59.582 * Looking for test storage... 00:05:59.582 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:05:59.582 10:37:15 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:05:59.582 10:37:15 -- scheduler/scheduler.sh@35 -- # scheduler_pid=1969766 00:05:59.582 10:37:15 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:05:59.582 10:37:15 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:05:59.582 10:37:15 -- scheduler/scheduler.sh@37 -- # waitforlisten 1969766 00:05:59.582 10:37:15 -- common/autotest_common.sh@819 -- # '[' -z 1969766 ']' 00:05:59.582 10:37:15 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:59.582 10:37:15 -- common/autotest_common.sh@824 -- # local max_retries=100 00:05:59.582 10:37:15 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:59.582 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:59.582 10:37:15 -- common/autotest_common.sh@828 -- # xtrace_disable 00:05:59.582 10:37:15 -- common/autotest_common.sh@10 -- # set +x 00:05:59.582 [2024-07-13 10:37:15.843217] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:05:59.582 [2024-07-13 10:37:15.843309] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1969766 ] 00:05:59.582 EAL: No free 2048 kB hugepages reported on node 1 00:05:59.582 [2024-07-13 10:37:15.909656] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:59.582 [2024-07-13 10:37:15.949144] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:59.582 [2024-07-13 10:37:15.949228] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:59.582 [2024-07-13 10:37:15.949309] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:59.582 [2024-07-13 10:37:15.949310] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:59.841 10:37:15 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:05:59.841 10:37:15 -- common/autotest_common.sh@852 -- # return 0 00:05:59.841 10:37:15 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:05:59.841 10:37:15 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:59.841 10:37:15 -- common/autotest_common.sh@10 -- # set +x 00:05:59.841 POWER: Env isn't set yet! 00:05:59.841 POWER: Attempting to initialise ACPI cpufreq power management... 00:05:59.841 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:05:59.841 POWER: Cannot set governor of lcore 0 to userspace 00:05:59.841 POWER: Attempting to initialise PSTAT power management... 00:05:59.841 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:05:59.841 POWER: Initialized successfully for lcore 0 power management 00:05:59.841 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:05:59.841 POWER: Initialized successfully for lcore 1 power management 00:05:59.841 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:05:59.841 POWER: Initialized successfully for lcore 2 power management 00:05:59.841 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:05:59.841 POWER: Initialized successfully for lcore 3 power management 00:05:59.841 [2024-07-13 10:37:16.038747] scheduler_dynamic.c: 387:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:05:59.841 [2024-07-13 10:37:16.038764] scheduler_dynamic.c: 389:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:05:59.841 [2024-07-13 10:37:16.038774] scheduler_dynamic.c: 391:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:05:59.841 10:37:16 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:59.841 10:37:16 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:05:59.841 10:37:16 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:59.841 10:37:16 -- common/autotest_common.sh@10 -- # set +x 00:05:59.841 [2024-07-13 10:37:16.101268] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
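[annotation] Because the scheduler app starts with --wait-for-rpc, the dynamic scheduler is selected before subsystem init, which is what the POWER and scheduler_dynamic NOTICE lines above record. The same sequence issued by hand, using only RPC names present in the rpc_get_methods output earlier in this log (SPDK_DIR is an assumed variable):

"$SPDK_DIR/scripts/rpc.py" framework_set_scheduler dynamic
"$SPDK_DIR/scripts/rpc.py" framework_start_init
"$SPDK_DIR/scripts/rpc.py" framework_get_scheduler   # should now report "dynamic"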
00:05:59.841 10:37:16 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:59.841 10:37:16 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:05:59.841 10:37:16 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:05:59.841 10:37:16 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:05:59.841 10:37:16 -- common/autotest_common.sh@10 -- # set +x 00:05:59.841 ************************************ 00:05:59.841 START TEST scheduler_create_thread 00:05:59.841 ************************************ 00:05:59.841 10:37:16 -- common/autotest_common.sh@1104 -- # scheduler_create_thread 00:05:59.841 10:37:16 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:05:59.841 10:37:16 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:59.841 10:37:16 -- common/autotest_common.sh@10 -- # set +x 00:05:59.841 2 00:05:59.841 10:37:16 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:59.841 10:37:16 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:05:59.841 10:37:16 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:59.841 10:37:16 -- common/autotest_common.sh@10 -- # set +x 00:05:59.841 3 00:05:59.841 10:37:16 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:59.841 10:37:16 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:05:59.841 10:37:16 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:59.841 10:37:16 -- common/autotest_common.sh@10 -- # set +x 00:05:59.841 4 00:05:59.841 10:37:16 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:59.841 10:37:16 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:05:59.841 10:37:16 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:59.841 10:37:16 -- common/autotest_common.sh@10 -- # set +x 00:05:59.841 5 00:05:59.841 10:37:16 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:59.842 10:37:16 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:05:59.842 10:37:16 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:59.842 10:37:16 -- common/autotest_common.sh@10 -- # set +x 00:05:59.842 6 00:05:59.842 10:37:16 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:59.842 10:37:16 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:05:59.842 10:37:16 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:59.842 10:37:16 -- common/autotest_common.sh@10 -- # set +x 00:05:59.842 7 00:05:59.842 10:37:16 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:59.842 10:37:16 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:05:59.842 10:37:16 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:59.842 10:37:16 -- common/autotest_common.sh@10 -- # set +x 00:05:59.842 8 00:05:59.842 10:37:16 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:59.842 10:37:16 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:05:59.842 10:37:16 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:59.842 10:37:16 -- common/autotest_common.sh@10 -- # set +x 00:05:59.842 9 00:05:59.842 
10:37:16 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:59.842 10:37:16 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:05:59.842 10:37:16 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:59.842 10:37:16 -- common/autotest_common.sh@10 -- # set +x 00:05:59.842 10 00:05:59.842 10:37:16 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:59.842 10:37:16 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:05:59.842 10:37:16 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:59.842 10:37:16 -- common/autotest_common.sh@10 -- # set +x 00:05:59.842 10:37:16 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:05:59.842 10:37:16 -- scheduler/scheduler.sh@22 -- # thread_id=11 00:05:59.842 10:37:16 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:05:59.842 10:37:16 -- common/autotest_common.sh@551 -- # xtrace_disable 00:05:59.842 10:37:16 -- common/autotest_common.sh@10 -- # set +x 00:06:00.410 10:37:16 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:00.410 10:37:16 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:06:00.410 10:37:16 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:00.410 10:37:16 -- common/autotest_common.sh@10 -- # set +x 00:06:01.785 10:37:18 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:01.785 10:37:18 -- scheduler/scheduler.sh@25 -- # thread_id=12 00:06:01.785 10:37:18 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:06:01.785 10:37:18 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:01.785 10:37:18 -- common/autotest_common.sh@10 -- # set +x 00:06:03.162 10:37:19 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:03.162 00:06:03.162 real 0m3.101s 00:06:03.162 user 0m0.020s 00:06:03.162 sys 0m0.010s 00:06:03.162 10:37:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:03.162 10:37:19 -- common/autotest_common.sh@10 -- # set +x 00:06:03.162 ************************************ 00:06:03.162 END TEST scheduler_create_thread 00:06:03.162 ************************************ 00:06:03.162 10:37:19 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:06:03.162 10:37:19 -- scheduler/scheduler.sh@46 -- # killprocess 1969766 00:06:03.162 10:37:19 -- common/autotest_common.sh@926 -- # '[' -z 1969766 ']' 00:06:03.162 10:37:19 -- common/autotest_common.sh@930 -- # kill -0 1969766 00:06:03.162 10:37:19 -- common/autotest_common.sh@931 -- # uname 00:06:03.162 10:37:19 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:03.162 10:37:19 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1969766 00:06:03.162 10:37:19 -- common/autotest_common.sh@932 -- # process_name=reactor_2 00:06:03.162 10:37:19 -- common/autotest_common.sh@936 -- # '[' reactor_2 = sudo ']' 00:06:03.162 10:37:19 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1969766' 00:06:03.162 killing process with pid 1969766 00:06:03.162 10:37:19 -- common/autotest_common.sh@945 -- # kill 1969766 00:06:03.162 10:37:19 -- common/autotest_common.sh@950 -- # wait 1969766 00:06:03.421 [2024-07-13 10:37:19.592237] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
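
The scheduler_create_thread trace above reduces to a handful of RPC calls against the test plugin (rpc_cmd in the trace is a thin wrapper over scripts/rpc.py on the default socket). A minimal sketch of the same flow; the RPC names, the -n/-m/-a flags, and the thread ids 11 and 12 are exactly as traced, while the loop over masks is a condensation:

  # four busy threads, one per core of the 0xf mask, each claiming 100% load
  for mask in 0x1 0x2 0x4 0x8; do
      ./scripts/rpc.py --plugin scheduler_plugin scheduler_thread_create \
          -n active_pinned -m $mask -a 100
  done
  # an unpinned thread created idle, then re-weighted to 50% by the id it returned
  ./scripts/rpc.py --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0
  ./scripts/rpc.py --plugin scheduler_plugin scheduler_thread_set_active 11 50
  # a throwaway thread, deleted again by id
  ./scripts/rpc.py --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100
  ./scripts/rpc.py --plugin scheduler_plugin scheduler_thread_delete 12
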
00:06:03.421 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:06:03.421 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:06:03.421 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:06:03.421 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:06:03.421 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:06:03.421 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:06:03.421 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:06:03.421 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:06:03.421 00:06:03.421 real 0m4.066s 00:06:03.421 user 0m6.558s 00:06:03.421 sys 0m0.355s 00:06:03.421 10:37:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:03.421 10:37:19 -- common/autotest_common.sh@10 -- # set +x 00:06:03.421 ************************************ 00:06:03.421 END TEST event_scheduler 00:06:03.421 ************************************ 00:06:03.680 10:37:19 -- event/event.sh@51 -- # modprobe -n nbd 00:06:03.680 10:37:19 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:06:03.680 10:37:19 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:03.680 10:37:19 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:03.680 10:37:19 -- common/autotest_common.sh@10 -- # set +x 00:06:03.680 ************************************ 00:06:03.680 START TEST app_repeat 00:06:03.680 ************************************ 00:06:03.680 10:37:19 -- common/autotest_common.sh@1104 -- # app_repeat_test 00:06:03.680 10:37:19 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:03.680 10:37:19 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:03.680 10:37:19 -- event/event.sh@13 -- # local nbd_list 00:06:03.680 10:37:19 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:03.680 10:37:19 -- event/event.sh@14 -- # local bdev_list 00:06:03.680 10:37:19 -- event/event.sh@15 -- # local repeat_times=4 00:06:03.680 10:37:19 -- event/event.sh@17 -- # modprobe nbd 00:06:03.680 10:37:19 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:06:03.680 10:37:19 -- event/event.sh@19 -- # repeat_pid=1970441 00:06:03.680 10:37:19 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:06:03.680 10:37:19 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 1970441' 00:06:03.680 Process app_repeat pid: 1970441 00:06:03.680 10:37:19 -- event/event.sh@23 -- # for i in {0..2} 00:06:03.680 10:37:19 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:06:03.680 spdk_app_start Round 0 00:06:03.680 10:37:19 -- event/event.sh@25 -- # waitforlisten 1970441 /var/tmp/spdk-nbd.sock 00:06:03.680 10:37:19 -- common/autotest_common.sh@819 -- # '[' -z 1970441 ']' 00:06:03.680 10:37:19 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:03.680 10:37:19 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:03.680 10:37:19 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:06:03.680 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:03.680 10:37:19 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:03.680 10:37:19 -- common/autotest_common.sh@10 -- # set +x 00:06:03.680 [2024-07-13 10:37:19.856754] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:03.680 [2024-07-13 10:37:19.856819] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1970441 ] 00:06:03.680 EAL: No free 2048 kB hugepages reported on node 1 00:06:03.680 [2024-07-13 10:37:19.922480] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:03.680 [2024-07-13 10:37:19.960975] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:03.680 [2024-07-13 10:37:19.960977] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:04.617 10:37:20 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:04.617 10:37:20 -- common/autotest_common.sh@852 -- # return 0 00:06:04.617 10:37:20 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:04.617 Malloc0 00:06:04.617 10:37:20 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:04.877 Malloc1 00:06:04.877 10:37:21 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:04.877 10:37:21 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:04.877 10:37:21 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:04.877 10:37:21 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:04.877 10:37:21 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:04.877 10:37:21 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:04.877 10:37:21 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:04.877 10:37:21 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:04.877 10:37:21 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:04.877 10:37:21 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:04.877 10:37:21 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:04.877 10:37:21 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:04.877 10:37:21 -- bdev/nbd_common.sh@12 -- # local i 00:06:04.877 10:37:21 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:04.877 10:37:21 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:04.877 10:37:21 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:04.877 /dev/nbd0 00:06:04.877 10:37:21 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:04.877 10:37:21 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:04.877 10:37:21 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:06:04.877 10:37:21 -- common/autotest_common.sh@857 -- # local i 00:06:04.877 10:37:21 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:06:04.877 10:37:21 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:06:04.877 10:37:21 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:06:04.877 10:37:21 -- 
common/autotest_common.sh@861 -- # break 00:06:04.877 10:37:21 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:06:04.877 10:37:21 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:06:04.877 10:37:21 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:04.877 1+0 records in 00:06:04.877 1+0 records out 00:06:04.877 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000231979 s, 17.7 MB/s 00:06:04.877 10:37:21 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:04.877 10:37:21 -- common/autotest_common.sh@874 -- # size=4096 00:06:04.877 10:37:21 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:05.137 10:37:21 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:06:05.137 10:37:21 -- common/autotest_common.sh@877 -- # return 0 00:06:05.137 10:37:21 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:05.137 10:37:21 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:05.137 10:37:21 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:05.137 /dev/nbd1 00:06:05.137 10:37:21 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:05.137 10:37:21 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:05.137 10:37:21 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:06:05.137 10:37:21 -- common/autotest_common.sh@857 -- # local i 00:06:05.137 10:37:21 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:06:05.137 10:37:21 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:06:05.137 10:37:21 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:06:05.137 10:37:21 -- common/autotest_common.sh@861 -- # break 00:06:05.137 10:37:21 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:06:05.137 10:37:21 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:06:05.137 10:37:21 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:05.137 1+0 records in 00:06:05.137 1+0 records out 00:06:05.137 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000224277 s, 18.3 MB/s 00:06:05.137 10:37:21 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:05.137 10:37:21 -- common/autotest_common.sh@874 -- # size=4096 00:06:05.137 10:37:21 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:05.137 10:37:21 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:06:05.137 10:37:21 -- common/autotest_common.sh@877 -- # return 0 00:06:05.137 10:37:21 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:05.137 10:37:21 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:05.137 10:37:21 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:05.137 10:37:21 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:05.137 10:37:21 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:05.396 { 00:06:05.396 "nbd_device": "/dev/nbd0", 00:06:05.396 "bdev_name": "Malloc0" 00:06:05.396 }, 00:06:05.396 { 00:06:05.396 "nbd_device": 
"/dev/nbd1", 00:06:05.396 "bdev_name": "Malloc1" 00:06:05.396 } 00:06:05.396 ]' 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:05.396 { 00:06:05.396 "nbd_device": "/dev/nbd0", 00:06:05.396 "bdev_name": "Malloc0" 00:06:05.396 }, 00:06:05.396 { 00:06:05.396 "nbd_device": "/dev/nbd1", 00:06:05.396 "bdev_name": "Malloc1" 00:06:05.396 } 00:06:05.396 ]' 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:05.396 /dev/nbd1' 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:05.396 /dev/nbd1' 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@65 -- # count=2 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@95 -- # count=2 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:05.396 256+0 records in 00:06:05.396 256+0 records out 00:06:05.396 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0101559 s, 103 MB/s 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:05.396 256+0 records in 00:06:05.396 256+0 records out 00:06:05.396 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0201803 s, 52.0 MB/s 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:05.396 256+0 records in 00:06:05.396 256+0 records out 00:06:05.396 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0213891 s, 49.0 MB/s 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:05.396 10:37:21 -- 
bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@51 -- # local i 00:06:05.396 10:37:21 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:05.397 10:37:21 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:05.655 10:37:21 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:05.655 10:37:21 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:05.655 10:37:21 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:05.655 10:37:21 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:05.655 10:37:21 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:05.655 10:37:21 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:05.655 10:37:21 -- bdev/nbd_common.sh@41 -- # break 00:06:05.655 10:37:21 -- bdev/nbd_common.sh@45 -- # return 0 00:06:05.655 10:37:21 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:05.655 10:37:21 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:05.914 10:37:22 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:05.914 10:37:22 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:05.914 10:37:22 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:05.914 10:37:22 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:05.914 10:37:22 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:05.914 10:37:22 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:05.914 10:37:22 -- bdev/nbd_common.sh@41 -- # break 00:06:05.914 10:37:22 -- bdev/nbd_common.sh@45 -- # return 0 00:06:05.914 10:37:22 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:05.914 10:37:22 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:05.915 10:37:22 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:06.174 10:37:22 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:06.174 10:37:22 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:06.174 10:37:22 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:06.174 10:37:22 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:06.174 10:37:22 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:06.174 10:37:22 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:06.174 10:37:22 -- bdev/nbd_common.sh@65 -- # true 00:06:06.174 10:37:22 -- bdev/nbd_common.sh@65 -- # count=0 00:06:06.174 10:37:22 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:06.174 10:37:22 -- bdev/nbd_common.sh@104 -- # count=0 00:06:06.174 10:37:22 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:06.174 10:37:22 -- bdev/nbd_common.sh@109 -- # return 0 00:06:06.174 10:37:22 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 
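
Between nbd_start_disk and nbd_stop_disk, nbd_dd_data_verify drives the actual I/O seen in the dd/cmp lines above: seed a scratch file with random data, push it through each exported nbd device with direct I/O, then read each device back and compare. Condensed sketch; the bs/count values and cmp flags are as traced, and $SPDK_TEST_DIR stands in for the long workspace path:

  tmp=$SPDK_TEST_DIR/nbdrandtest                            # 1 MiB scratch file
  dd if=/dev/urandom of="$tmp" bs=4096 count=256
  for nbd in /dev/nbd0 /dev/nbd1; do
      dd if="$tmp" of="$nbd" bs=4096 count=256 oflag=direct  # write phase
  done
  for nbd in /dev/nbd0 /dev/nbd1; do
      cmp -b -n 1M "$tmp" "$nbd"                             # byte-wise verify
  done
  rm "$tmp"
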
00:06:06.174 10:37:22 -- event/event.sh@35 -- # sleep 3 00:06:06.432 [2024-07-13 10:37:22.728320] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:06.432 [2024-07-13 10:37:22.760456] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:06.432 [2024-07-13 10:37:22.760461] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:06.432 [2024-07-13 10:37:22.801074] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:06.432 [2024-07-13 10:37:22.801115] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:09.719 10:37:25 -- event/event.sh@23 -- # for i in {0..2} 00:06:09.719 10:37:25 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:06:09.719 spdk_app_start Round 1 00:06:09.719 10:37:25 -- event/event.sh@25 -- # waitforlisten 1970441 /var/tmp/spdk-nbd.sock 00:06:09.719 10:37:25 -- common/autotest_common.sh@819 -- # '[' -z 1970441 ']' 00:06:09.719 10:37:25 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:09.719 10:37:25 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:09.719 10:37:25 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:09.719 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:09.719 10:37:25 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:09.719 10:37:25 -- common/autotest_common.sh@10 -- # set +x 00:06:09.719 10:37:25 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:09.719 10:37:25 -- common/autotest_common.sh@852 -- # return 0 00:06:09.719 10:37:25 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:09.719 Malloc0 00:06:09.719 10:37:25 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:09.719 Malloc1 00:06:09.719 10:37:26 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:09.719 10:37:26 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:09.719 10:37:26 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:09.719 10:37:26 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:09.719 10:37:26 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:09.719 10:37:26 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:09.719 10:37:26 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:09.719 10:37:26 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:09.719 10:37:26 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:09.719 10:37:26 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:09.719 10:37:26 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:09.719 10:37:26 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:09.719 10:37:26 -- bdev/nbd_common.sh@12 -- # local i 00:06:09.719 10:37:26 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:09.719 10:37:26 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:09.719 10:37:26 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:09.979 
/dev/nbd0 00:06:09.979 10:37:26 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:09.979 10:37:26 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:09.979 10:37:26 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:06:09.979 10:37:26 -- common/autotest_common.sh@857 -- # local i 00:06:09.979 10:37:26 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:06:09.979 10:37:26 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:06:09.979 10:37:26 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:06:09.979 10:37:26 -- common/autotest_common.sh@861 -- # break 00:06:09.979 10:37:26 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:06:09.979 10:37:26 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:06:09.979 10:37:26 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:09.979 1+0 records in 00:06:09.979 1+0 records out 00:06:09.979 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000210639 s, 19.4 MB/s 00:06:09.979 10:37:26 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:09.979 10:37:26 -- common/autotest_common.sh@874 -- # size=4096 00:06:09.979 10:37:26 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:09.979 10:37:26 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:06:09.979 10:37:26 -- common/autotest_common.sh@877 -- # return 0 00:06:09.979 10:37:26 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:09.979 10:37:26 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:09.979 10:37:26 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:10.238 /dev/nbd1 00:06:10.238 10:37:26 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:10.238 10:37:26 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:10.238 10:37:26 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:06:10.238 10:37:26 -- common/autotest_common.sh@857 -- # local i 00:06:10.238 10:37:26 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:06:10.238 10:37:26 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:06:10.239 10:37:26 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:06:10.239 10:37:26 -- common/autotest_common.sh@861 -- # break 00:06:10.239 10:37:26 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:06:10.239 10:37:26 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:06:10.239 10:37:26 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:10.239 1+0 records in 00:06:10.239 1+0 records out 00:06:10.239 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000282848 s, 14.5 MB/s 00:06:10.239 10:37:26 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:10.239 10:37:26 -- common/autotest_common.sh@874 -- # size=4096 00:06:10.239 10:37:26 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:10.239 10:37:26 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:06:10.239 10:37:26 -- common/autotest_common.sh@877 -- # return 0 00:06:10.239 10:37:26 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:10.239 10:37:26 -- bdev/nbd_common.sh@14 -- # (( i < 2 
)) 00:06:10.239 10:37:26 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:10.239 10:37:26 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:10.239 10:37:26 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:10.499 { 00:06:10.499 "nbd_device": "/dev/nbd0", 00:06:10.499 "bdev_name": "Malloc0" 00:06:10.499 }, 00:06:10.499 { 00:06:10.499 "nbd_device": "/dev/nbd1", 00:06:10.499 "bdev_name": "Malloc1" 00:06:10.499 } 00:06:10.499 ]' 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:10.499 { 00:06:10.499 "nbd_device": "/dev/nbd0", 00:06:10.499 "bdev_name": "Malloc0" 00:06:10.499 }, 00:06:10.499 { 00:06:10.499 "nbd_device": "/dev/nbd1", 00:06:10.499 "bdev_name": "Malloc1" 00:06:10.499 } 00:06:10.499 ]' 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:10.499 /dev/nbd1' 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:10.499 /dev/nbd1' 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@65 -- # count=2 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@95 -- # count=2 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:10.499 256+0 records in 00:06:10.499 256+0 records out 00:06:10.499 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00456092 s, 230 MB/s 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:10.499 256+0 records in 00:06:10.499 256+0 records out 00:06:10.499 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0201898 s, 51.9 MB/s 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:10.499 256+0 records in 00:06:10.499 256+0 records out 00:06:10.499 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0217861 s, 48.1 MB/s 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@51 -- # local i 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:10.499 10:37:26 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:10.759 10:37:26 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:10.759 10:37:26 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:10.759 10:37:26 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:10.759 10:37:26 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:10.759 10:37:26 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:10.759 10:37:26 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:10.759 10:37:26 -- bdev/nbd_common.sh@41 -- # break 00:06:10.759 10:37:26 -- bdev/nbd_common.sh@45 -- # return 0 00:06:10.759 10:37:26 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:10.759 10:37:26 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:11.018 10:37:27 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:11.018 10:37:27 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:11.018 10:37:27 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:11.018 10:37:27 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:11.018 10:37:27 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:11.018 10:37:27 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:11.018 10:37:27 -- bdev/nbd_common.sh@41 -- # break 00:06:11.018 10:37:27 -- bdev/nbd_common.sh@45 -- # return 0 00:06:11.018 10:37:27 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:11.018 10:37:27 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:11.018 10:37:27 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:11.018 10:37:27 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:11.018 10:37:27 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:11.018 10:37:27 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:11.018 10:37:27 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:11.018 10:37:27 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:11.018 10:37:27 -- 
bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:11.018 10:37:27 -- bdev/nbd_common.sh@65 -- # true 00:06:11.018 10:37:27 -- bdev/nbd_common.sh@65 -- # count=0 00:06:11.018 10:37:27 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:11.018 10:37:27 -- bdev/nbd_common.sh@104 -- # count=0 00:06:11.018 10:37:27 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:11.018 10:37:27 -- bdev/nbd_common.sh@109 -- # return 0 00:06:11.018 10:37:27 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:11.278 10:37:27 -- event/event.sh@35 -- # sleep 3 00:06:11.536 [2024-07-13 10:37:27.740678] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:11.537 [2024-07-13 10:37:27.773158] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:11.537 [2024-07-13 10:37:27.773161] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:11.537 [2024-07-13 10:37:27.813839] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:11.537 [2024-07-13 10:37:27.813880] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:14.827 10:37:30 -- event/event.sh@23 -- # for i in {0..2} 00:06:14.827 10:37:30 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:06:14.827 spdk_app_start Round 2 00:06:14.827 10:37:30 -- event/event.sh@25 -- # waitforlisten 1970441 /var/tmp/spdk-nbd.sock 00:06:14.827 10:37:30 -- common/autotest_common.sh@819 -- # '[' -z 1970441 ']' 00:06:14.827 10:37:30 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:14.827 10:37:30 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:14.827 10:37:30 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:14.827 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
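
Round 1 now replays the per-device handshake from Round 0: after every nbd_start_disk, waitfornbd polls /proc/partitions for the new node and then proves it readable with one direct-I/O block, which is the grep/dd/stat/rm sequence in the traces above and below. Reconstructed as a sketch; the grep, dd, stat, and size checks are as traced, while the retry sleep and the /tmp/nbdtest path are assumptions:

  waitfornbd() {
      local nbd_name=$1 i size
      for ((i = 1; i <= 20; i++)); do                # wait for the kernel node
          grep -q -w "$nbd_name" /proc/partitions && break
          sleep 0.1                                  # assumed back-off
      done
      for ((i = 1; i <= 20; i++)); do                # prove one block is readable
          dd if=/dev/$nbd_name of=/tmp/nbdtest bs=4096 count=1 iflag=direct || continue
          size=$(stat -c %s /tmp/nbdtest)
          rm -f /tmp/nbdtest
          [ "$size" != 0 ] && return 0
      done
      return 1
  }
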
00:06:14.827 10:37:30 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:14.827 10:37:30 -- common/autotest_common.sh@10 -- # set +x 00:06:14.827 10:37:30 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:14.827 10:37:30 -- common/autotest_common.sh@852 -- # return 0 00:06:14.827 10:37:30 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:14.827 Malloc0 00:06:14.827 10:37:30 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:14.827 Malloc1 00:06:14.827 10:37:31 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:14.827 10:37:31 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:14.827 10:37:31 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:14.827 10:37:31 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:14.827 10:37:31 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:14.827 10:37:31 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:14.827 10:37:31 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:14.827 10:37:31 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:14.827 10:37:31 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:14.827 10:37:31 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:14.827 10:37:31 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:14.827 10:37:31 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:14.827 10:37:31 -- bdev/nbd_common.sh@12 -- # local i 00:06:14.827 10:37:31 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:14.827 10:37:31 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:14.827 10:37:31 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:15.086 /dev/nbd0 00:06:15.086 10:37:31 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:15.086 10:37:31 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:15.086 10:37:31 -- common/autotest_common.sh@856 -- # local nbd_name=nbd0 00:06:15.086 10:37:31 -- common/autotest_common.sh@857 -- # local i 00:06:15.086 10:37:31 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:06:15.086 10:37:31 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:06:15.086 10:37:31 -- common/autotest_common.sh@860 -- # grep -q -w nbd0 /proc/partitions 00:06:15.086 10:37:31 -- common/autotest_common.sh@861 -- # break 00:06:15.086 10:37:31 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:06:15.086 10:37:31 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:06:15.086 10:37:31 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:15.086 1+0 records in 00:06:15.086 1+0 records out 00:06:15.086 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000229758 s, 17.8 MB/s 00:06:15.086 10:37:31 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:15.086 10:37:31 -- common/autotest_common.sh@874 -- # size=4096 00:06:15.086 10:37:31 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:15.086 10:37:31 -- 
common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:06:15.086 10:37:31 -- common/autotest_common.sh@877 -- # return 0 00:06:15.086 10:37:31 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:15.086 10:37:31 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:15.086 10:37:31 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:15.086 /dev/nbd1 00:06:15.086 10:37:31 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:15.086 10:37:31 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:15.086 10:37:31 -- common/autotest_common.sh@856 -- # local nbd_name=nbd1 00:06:15.086 10:37:31 -- common/autotest_common.sh@857 -- # local i 00:06:15.086 10:37:31 -- common/autotest_common.sh@859 -- # (( i = 1 )) 00:06:15.086 10:37:31 -- common/autotest_common.sh@859 -- # (( i <= 20 )) 00:06:15.086 10:37:31 -- common/autotest_common.sh@860 -- # grep -q -w nbd1 /proc/partitions 00:06:15.346 10:37:31 -- common/autotest_common.sh@861 -- # break 00:06:15.346 10:37:31 -- common/autotest_common.sh@872 -- # (( i = 1 )) 00:06:15.346 10:37:31 -- common/autotest_common.sh@872 -- # (( i <= 20 )) 00:06:15.346 10:37:31 -- common/autotest_common.sh@873 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:15.346 1+0 records in 00:06:15.346 1+0 records out 00:06:15.346 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252759 s, 16.2 MB/s 00:06:15.346 10:37:31 -- common/autotest_common.sh@874 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:15.346 10:37:31 -- common/autotest_common.sh@874 -- # size=4096 00:06:15.346 10:37:31 -- common/autotest_common.sh@875 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:15.346 10:37:31 -- common/autotest_common.sh@876 -- # '[' 4096 '!=' 0 ']' 00:06:15.346 10:37:31 -- common/autotest_common.sh@877 -- # return 0 00:06:15.346 10:37:31 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:15.346 10:37:31 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:15.346 10:37:31 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:15.346 10:37:31 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:15.346 10:37:31 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:15.346 10:37:31 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:15.346 { 00:06:15.346 "nbd_device": "/dev/nbd0", 00:06:15.346 "bdev_name": "Malloc0" 00:06:15.346 }, 00:06:15.346 { 00:06:15.346 "nbd_device": "/dev/nbd1", 00:06:15.346 "bdev_name": "Malloc1" 00:06:15.346 } 00:06:15.346 ]' 00:06:15.346 10:37:31 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:15.346 { 00:06:15.346 "nbd_device": "/dev/nbd0", 00:06:15.346 "bdev_name": "Malloc0" 00:06:15.346 }, 00:06:15.346 { 00:06:15.346 "nbd_device": "/dev/nbd1", 00:06:15.346 "bdev_name": "Malloc1" 00:06:15.346 } 00:06:15.346 ]' 00:06:15.346 10:37:31 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:15.346 10:37:31 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:15.346 /dev/nbd1' 00:06:15.346 10:37:31 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:15.346 /dev/nbd1' 00:06:15.346 10:37:31 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:15.346 10:37:31 -- bdev/nbd_common.sh@65 -- # count=2 00:06:15.346 10:37:31 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:15.346 10:37:31 -- 
bdev/nbd_common.sh@95 -- # count=2 00:06:15.346 10:37:31 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:15.346 10:37:31 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:15.346 10:37:31 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:15.346 10:37:31 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:15.346 10:37:31 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:15.346 10:37:31 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:15.346 10:37:31 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:15.346 10:37:31 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:15.346 256+0 records in 00:06:15.346 256+0 records out 00:06:15.346 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0113617 s, 92.3 MB/s 00:06:15.346 10:37:31 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:15.346 10:37:31 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:15.606 256+0 records in 00:06:15.606 256+0 records out 00:06:15.606 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0202476 s, 51.8 MB/s 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:15.606 256+0 records in 00:06:15.606 256+0 records out 00:06:15.606 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0214163 s, 49.0 MB/s 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@51 -- # local i 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@41 -- # break 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@45 -- # return 0 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:15.606 10:37:31 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:15.866 10:37:32 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:15.866 10:37:32 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:15.866 10:37:32 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:15.866 10:37:32 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:15.866 10:37:32 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:15.866 10:37:32 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:15.866 10:37:32 -- bdev/nbd_common.sh@41 -- # break 00:06:15.866 10:37:32 -- bdev/nbd_common.sh@45 -- # return 0 00:06:15.866 10:37:32 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:15.866 10:37:32 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:15.866 10:37:32 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:16.125 10:37:32 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:16.125 10:37:32 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:16.125 10:37:32 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:16.125 10:37:32 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:16.125 10:37:32 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:16.125 10:37:32 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:16.125 10:37:32 -- bdev/nbd_common.sh@65 -- # true 00:06:16.125 10:37:32 -- bdev/nbd_common.sh@65 -- # count=0 00:06:16.125 10:37:32 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:16.125 10:37:32 -- bdev/nbd_common.sh@104 -- # count=0 00:06:16.125 10:37:32 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:16.125 10:37:32 -- bdev/nbd_common.sh@109 -- # return 0 00:06:16.125 10:37:32 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:16.384 10:37:32 -- event/event.sh@35 -- # sleep 3 00:06:16.384 [2024-07-13 10:37:32.737855] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:16.384 [2024-07-13 10:37:32.770286] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:16.384 [2024-07-13 10:37:32.770289] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:16.644 [2024-07-13 10:37:32.810952] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:16.644 [2024-07-13 10:37:32.810991] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
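
The Round 0/1/2 structure above comes from one loop in event.sh: the same app_repeat process (pid 1970441 throughout) is told to die over its RPC socket at the end of each round and restarts its framework for the next, which the "spdk_app_start is called in Round N" messages at final shutdown confirm. Schematically, with the elided middle being the Malloc/nbd/verify work traced above:

  for i in {0..2}; do
      echo "spdk_app_start Round $i"
      waitforlisten $repeat_pid /var/tmp/spdk-nbd.sock
      # ... bdev_malloc_create, nbd_start_disk, write + verify, nbd_stop_disk ...
      ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM
      sleep 3
  done
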
00:06:19.932 10:37:35 -- event/event.sh@38 -- # waitforlisten 1970441 /var/tmp/spdk-nbd.sock 00:06:19.932 10:37:35 -- common/autotest_common.sh@819 -- # '[' -z 1970441 ']' 00:06:19.932 10:37:35 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:19.932 10:37:35 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:19.932 10:37:35 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:19.932 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:19.932 10:37:35 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:19.932 10:37:35 -- common/autotest_common.sh@10 -- # set +x 00:06:19.932 10:37:35 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:19.932 10:37:35 -- common/autotest_common.sh@852 -- # return 0 00:06:19.932 10:37:35 -- event/event.sh@39 -- # killprocess 1970441 00:06:19.932 10:37:35 -- common/autotest_common.sh@926 -- # '[' -z 1970441 ']' 00:06:19.932 10:37:35 -- common/autotest_common.sh@930 -- # kill -0 1970441 00:06:19.932 10:37:35 -- common/autotest_common.sh@931 -- # uname 00:06:19.932 10:37:35 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:19.932 10:37:35 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1970441 00:06:19.932 10:37:35 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:19.932 10:37:35 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:19.932 10:37:35 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1970441' 00:06:19.932 killing process with pid 1970441 00:06:19.932 10:37:35 -- common/autotest_common.sh@945 -- # kill 1970441 00:06:19.932 10:37:35 -- common/autotest_common.sh@950 -- # wait 1970441 00:06:19.932 spdk_app_start is called in Round 0. 00:06:19.932 Shutdown signal received, stop current app iteration 00:06:19.932 Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 reinitialization... 00:06:19.932 spdk_app_start is called in Round 1. 00:06:19.932 Shutdown signal received, stop current app iteration 00:06:19.932 Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 reinitialization... 00:06:19.932 spdk_app_start is called in Round 2. 00:06:19.932 Shutdown signal received, stop current app iteration 00:06:19.932 Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 reinitialization... 00:06:19.932 spdk_app_start is called in Round 3. 
00:06:19.932 Shutdown signal received, stop current app iteration 00:06:19.932 10:37:35 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:06:19.932 10:37:35 -- event/event.sh@42 -- # return 0 00:06:19.932 00:06:19.932 real 0m16.101s 00:06:19.932 user 0m34.272s 00:06:19.932 sys 0m3.048s 00:06:19.932 10:37:35 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:19.932 10:37:35 -- common/autotest_common.sh@10 -- # set +x 00:06:19.932 ************************************ 00:06:19.932 END TEST app_repeat 00:06:19.932 ************************************ 00:06:19.933 10:37:35 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:06:19.933 10:37:35 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:19.933 10:37:35 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:19.933 10:37:35 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:19.933 10:37:35 -- common/autotest_common.sh@10 -- # set +x 00:06:19.933 ************************************ 00:06:19.933 START TEST cpu_locks 00:06:19.933 ************************************ 00:06:19.933 10:37:35 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:19.933 * Looking for test storage... 00:06:19.933 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:19.933 10:37:36 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:06:19.933 10:37:36 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:06:19.933 10:37:36 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:06:19.933 10:37:36 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:06:19.933 10:37:36 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:19.933 10:37:36 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:19.933 10:37:36 -- common/autotest_common.sh@10 -- # set +x 00:06:19.933 ************************************ 00:06:19.933 START TEST default_locks 00:06:19.933 ************************************ 00:06:19.933 10:37:36 -- common/autotest_common.sh@1104 -- # default_locks 00:06:19.933 10:37:36 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=1973563 00:06:19.933 10:37:36 -- event/cpu_locks.sh@47 -- # waitforlisten 1973563 00:06:19.933 10:37:36 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:19.933 10:37:36 -- common/autotest_common.sh@819 -- # '[' -z 1973563 ']' 00:06:19.933 10:37:36 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:19.933 10:37:36 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:19.933 10:37:36 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:19.933 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:19.933 10:37:36 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:19.933 10:37:36 -- common/autotest_common.sh@10 -- # set +x 00:06:19.933 [2024-07-13 10:37:36.117341] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
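
default_locks starts the way the trace shows: a bare spdk_tgt pinned to core 0, then a wait for its default RPC socket. A condensed sketch; the binary path and -m 0x1 mask are as traced, while backgrounding with & and capturing $! are assumptions about how the helper records the pid:

  ./build/bin/spdk_tgt -m 0x1 &      # one-core target; it takes the core-0 lock
  spdk_tgt_pid=$!
  waitforlisten $spdk_tgt_pid        # returns once /var/tmp/spdk.sock answers
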
00:06:19.933 [2024-07-13 10:37:36.117433] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1973563 ] 00:06:19.933 EAL: No free 2048 kB hugepages reported on node 1 00:06:19.933 [2024-07-13 10:37:36.184800] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:19.933 [2024-07-13 10:37:36.220979] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:19.933 [2024-07-13 10:37:36.221092] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:20.869 10:37:36 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:20.869 10:37:36 -- common/autotest_common.sh@852 -- # return 0 00:06:20.869 10:37:36 -- event/cpu_locks.sh@49 -- # locks_exist 1973563 00:06:20.869 10:37:36 -- event/cpu_locks.sh@22 -- # lslocks -p 1973563 00:06:20.869 10:37:36 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:21.438 lslocks: write error 00:06:21.438 10:37:37 -- event/cpu_locks.sh@50 -- # killprocess 1973563 00:06:21.438 10:37:37 -- common/autotest_common.sh@926 -- # '[' -z 1973563 ']' 00:06:21.438 10:37:37 -- common/autotest_common.sh@930 -- # kill -0 1973563 00:06:21.438 10:37:37 -- common/autotest_common.sh@931 -- # uname 00:06:21.438 10:37:37 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:21.438 10:37:37 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1973563 00:06:21.438 10:37:37 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:21.438 10:37:37 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:21.438 10:37:37 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1973563' 00:06:21.438 killing process with pid 1973563 00:06:21.438 10:37:37 -- common/autotest_common.sh@945 -- # kill 1973563 00:06:21.438 10:37:37 -- common/autotest_common.sh@950 -- # wait 1973563 00:06:21.698 10:37:37 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 1973563 00:06:21.698 10:37:37 -- common/autotest_common.sh@640 -- # local es=0 00:06:21.698 10:37:37 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 1973563 00:06:21.698 10:37:37 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:06:21.698 10:37:37 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:21.698 10:37:37 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:06:21.698 10:37:37 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:21.698 10:37:37 -- common/autotest_common.sh@643 -- # waitforlisten 1973563 00:06:21.698 10:37:37 -- common/autotest_common.sh@819 -- # '[' -z 1973563 ']' 00:06:21.698 10:37:37 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:21.698 10:37:37 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:21.698 10:37:37 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:21.698 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
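A note on the locks_exist check and the stray 'lslocks: write error' traced above: the helper pipes lslocks -p <pid> into grep -q, and grep -q exits the moment it matches spdk_cpu_lock, so lslocks takes an EPIPE on the rest of its output. The error is cosmetic; the check that matters reduces to one line:

    # does the target pid hold an SPDK per-core lock?  grep -q's early
    # exit is what provokes the harmless "lslocks: write error"
    lslocks -p 1973563 | grep -q spdk_cpu_lock && echo 'core lock held'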
00:06:21.698 10:37:37 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:21.698 10:37:37 -- common/autotest_common.sh@10 -- # set +x 00:06:21.698 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (1973563) - No such process 00:06:21.698 ERROR: process (pid: 1973563) is no longer running 00:06:21.698 10:37:37 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:21.698 10:37:37 -- common/autotest_common.sh@852 -- # return 1 00:06:21.698 10:37:37 -- common/autotest_common.sh@643 -- # es=1 00:06:21.698 10:37:37 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:21.698 10:37:37 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:21.698 10:37:37 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:21.698 10:37:37 -- event/cpu_locks.sh@54 -- # no_locks 00:06:21.698 10:37:37 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:21.698 10:37:37 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:21.698 10:37:37 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:21.698 00:06:21.698 real 0m1.908s 00:06:21.698 user 0m1.993s 00:06:21.698 sys 0m0.690s 00:06:21.698 10:37:37 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:21.698 10:37:37 -- common/autotest_common.sh@10 -- # set +x 00:06:21.698 ************************************ 00:06:21.698 END TEST default_locks 00:06:21.698 ************************************ 00:06:21.698 10:37:38 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:06:21.698 10:37:38 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:21.698 10:37:38 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:21.698 10:37:38 -- common/autotest_common.sh@10 -- # set +x 00:06:21.698 ************************************ 00:06:21.698 START TEST default_locks_via_rpc 00:06:21.698 ************************************ 00:06:21.698 10:37:38 -- common/autotest_common.sh@1104 -- # default_locks_via_rpc 00:06:21.698 10:37:38 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=1973866 00:06:21.698 10:37:38 -- event/cpu_locks.sh@63 -- # waitforlisten 1973866 00:06:21.698 10:37:38 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:21.698 10:37:38 -- common/autotest_common.sh@819 -- # '[' -z 1973866 ']' 00:06:21.698 10:37:38 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:21.698 10:37:38 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:21.698 10:37:38 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:21.698 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:21.698 10:37:38 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:21.698 10:37:38 -- common/autotest_common.sh@10 -- # set +x 00:06:21.698 [2024-07-13 10:37:38.075465] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
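The NOT waitforlisten sequence above is the negative half of default_locks: once the target is killed, waiting on pid 1973563 must fail ('No such process'), and the NOT wrapper turns that failure into a pass. Condensed from the traced statements (local es=0, es flipping to 1, the (( es > 128 )) signal-exit check), a minimal sketch of the wrapper:

    # condensed NOT sketch, reconstructed from the xtrace above
    NOT() {
        local es=0
        "$@" || es=$?      # run the wrapped command, capture its exit code
        (( es != 0 ))      # succeed only when the wrapped command failed
    }
    NOT waitforlisten 1973563   # dead pid -> failure -> test passes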
00:06:21.698 [2024-07-13 10:37:38.075554] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1973866 ] 00:06:21.957 EAL: No free 2048 kB hugepages reported on node 1 00:06:21.957 [2024-07-13 10:37:38.143679] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:21.957 [2024-07-13 10:37:38.180786] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:21.957 [2024-07-13 10:37:38.180891] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:22.523 10:37:38 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:22.523 10:37:38 -- common/autotest_common.sh@852 -- # return 0 00:06:22.523 10:37:38 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:06:22.523 10:37:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:22.523 10:37:38 -- common/autotest_common.sh@10 -- # set +x 00:06:22.523 10:37:38 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:22.523 10:37:38 -- event/cpu_locks.sh@67 -- # no_locks 00:06:22.523 10:37:38 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:22.523 10:37:38 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:22.523 10:37:38 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:22.523 10:37:38 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:06:22.523 10:37:38 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:22.523 10:37:38 -- common/autotest_common.sh@10 -- # set +x 00:06:22.523 10:37:38 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:22.523 10:37:38 -- event/cpu_locks.sh@71 -- # locks_exist 1973866 00:06:22.523 10:37:38 -- event/cpu_locks.sh@22 -- # lslocks -p 1973866 00:06:22.523 10:37:38 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:23.089 10:37:39 -- event/cpu_locks.sh@73 -- # killprocess 1973866 00:06:23.089 10:37:39 -- common/autotest_common.sh@926 -- # '[' -z 1973866 ']' 00:06:23.089 10:37:39 -- common/autotest_common.sh@930 -- # kill -0 1973866 00:06:23.089 10:37:39 -- common/autotest_common.sh@931 -- # uname 00:06:23.089 10:37:39 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:23.089 10:37:39 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1973866 00:06:23.348 10:37:39 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:23.348 10:37:39 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:23.348 10:37:39 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1973866' 00:06:23.348 killing process with pid 1973866 00:06:23.348 10:37:39 -- common/autotest_common.sh@945 -- # kill 1973866 00:06:23.348 10:37:39 -- common/autotest_common.sh@950 -- # wait 1973866 00:06:23.607 00:06:23.607 real 0m1.749s 00:06:23.607 user 0m1.839s 00:06:23.607 sys 0m0.583s 00:06:23.607 10:37:39 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:23.607 10:37:39 -- common/autotest_common.sh@10 -- # set +x 00:06:23.607 ************************************ 00:06:23.607 END TEST default_locks_via_rpc 00:06:23.607 ************************************ 00:06:23.607 10:37:39 -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:06:23.607 10:37:39 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:23.607 10:37:39 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:23.607 10:37:39 -- 
common/autotest_common.sh@10 -- # set +x 00:06:23.607 ************************************ 00:06:23.607 START TEST non_locking_app_on_locked_coremask 00:06:23.607 ************************************ 00:06:23.607 10:37:39 -- common/autotest_common.sh@1104 -- # non_locking_app_on_locked_coremask 00:06:23.607 10:37:39 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=1974246 00:06:23.607 10:37:39 -- event/cpu_locks.sh@81 -- # waitforlisten 1974246 /var/tmp/spdk.sock 00:06:23.607 10:37:39 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:23.607 10:37:39 -- common/autotest_common.sh@819 -- # '[' -z 1974246 ']' 00:06:23.607 10:37:39 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:23.607 10:37:39 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:23.607 10:37:39 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:23.607 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:23.607 10:37:39 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:23.607 10:37:39 -- common/autotest_common.sh@10 -- # set +x 00:06:23.607 [2024-07-13 10:37:39.874001] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:23.607 [2024-07-13 10:37:39.874093] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1974246 ] 00:06:23.607 EAL: No free 2048 kB hugepages reported on node 1 00:06:23.607 [2024-07-13 10:37:39.943052] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:23.607 [2024-07-13 10:37:39.980213] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:23.607 [2024-07-13 10:37:39.980320] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:24.664 10:37:40 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:24.664 10:37:40 -- common/autotest_common.sh@852 -- # return 0 00:06:24.664 10:37:40 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:06:24.664 10:37:40 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=1974443 00:06:24.664 10:37:40 -- event/cpu_locks.sh@85 -- # waitforlisten 1974443 /var/tmp/spdk2.sock 00:06:24.664 10:37:40 -- common/autotest_common.sh@819 -- # '[' -z 1974443 ']' 00:06:24.664 10:37:40 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:24.664 10:37:40 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:24.664 10:37:40 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:24.664 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:24.664 10:37:40 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:24.664 10:37:40 -- common/autotest_common.sh@10 -- # set +x 00:06:24.664 [2024-07-13 10:37:40.685968] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
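non_locking_app_on_locked_coremask drives two targets onto the same core: the first spdk_tgt claims core 0 and its lock as usual, while the second is started with --disable-cpumask-locks so it skips the claim entirely, and -r gives it a private RPC socket so the daemons do not collide. Stripped of the Jenkins workspace paths, the traced launch pair is:

    # first target claims core 0 (default locking behavior)
    ./build/bin/spdk_tgt -m 0x1 &
    # second target shares core 0 but opts out of the core-lock claim,
    # answering JSON-RPC on its own socket
    ./build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock &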
00:06:24.664 [2024-07-13 10:37:40.686017] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1974443 ] 00:06:24.664 EAL: No free 2048 kB hugepages reported on node 1 00:06:24.664 [2024-07-13 10:37:40.770583] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:24.664 [2024-07-13 10:37:40.770603] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:24.664 [2024-07-13 10:37:40.847538] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:24.664 [2024-07-13 10:37:40.847658] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:25.231 10:37:41 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:25.231 10:37:41 -- common/autotest_common.sh@852 -- # return 0 00:06:25.231 10:37:41 -- event/cpu_locks.sh@87 -- # locks_exist 1974246 00:06:25.231 10:37:41 -- event/cpu_locks.sh@22 -- # lslocks -p 1974246 00:06:25.231 10:37:41 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:26.167 lslocks: write error 00:06:26.167 10:37:42 -- event/cpu_locks.sh@89 -- # killprocess 1974246 00:06:26.167 10:37:42 -- common/autotest_common.sh@926 -- # '[' -z 1974246 ']' 00:06:26.167 10:37:42 -- common/autotest_common.sh@930 -- # kill -0 1974246 00:06:26.167 10:37:42 -- common/autotest_common.sh@931 -- # uname 00:06:26.167 10:37:42 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:26.167 10:37:42 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1974246 00:06:26.167 10:37:42 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:26.167 10:37:42 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:26.167 10:37:42 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1974246' 00:06:26.167 killing process with pid 1974246 00:06:26.167 10:37:42 -- common/autotest_common.sh@945 -- # kill 1974246 00:06:26.167 10:37:42 -- common/autotest_common.sh@950 -- # wait 1974246 00:06:26.734 10:37:43 -- event/cpu_locks.sh@90 -- # killprocess 1974443 00:06:26.734 10:37:43 -- common/autotest_common.sh@926 -- # '[' -z 1974443 ']' 00:06:26.734 10:37:43 -- common/autotest_common.sh@930 -- # kill -0 1974443 00:06:26.734 10:37:43 -- common/autotest_common.sh@931 -- # uname 00:06:26.734 10:37:43 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:26.734 10:37:43 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1974443 00:06:26.734 10:37:43 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:26.734 10:37:43 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:26.734 10:37:43 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1974443' 00:06:26.734 killing process with pid 1974443 00:06:26.734 10:37:43 -- common/autotest_common.sh@945 -- # kill 1974443 00:06:26.734 10:37:43 -- common/autotest_common.sh@950 -- # wait 1974443 00:06:26.992 00:06:26.992 real 0m3.530s 00:06:26.992 user 0m3.720s 00:06:26.992 sys 0m1.134s 00:06:26.992 10:37:43 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:27.250 10:37:43 -- common/autotest_common.sh@10 -- # set +x 00:06:27.250 ************************************ 00:06:27.250 END TEST non_locking_app_on_locked_coremask 00:06:27.250 ************************************ 00:06:27.250 10:37:43 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask 
locking_app_on_unlocked_coremask 00:06:27.250 10:37:43 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:27.250 10:37:43 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:27.250 10:37:43 -- common/autotest_common.sh@10 -- # set +x 00:06:27.250 ************************************ 00:06:27.250 START TEST locking_app_on_unlocked_coremask 00:06:27.250 ************************************ 00:06:27.250 10:37:43 -- common/autotest_common.sh@1104 -- # locking_app_on_unlocked_coremask 00:06:27.250 10:37:43 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=1975005 00:06:27.250 10:37:43 -- event/cpu_locks.sh@99 -- # waitforlisten 1975005 /var/tmp/spdk.sock 00:06:27.250 10:37:43 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:06:27.250 10:37:43 -- common/autotest_common.sh@819 -- # '[' -z 1975005 ']' 00:06:27.250 10:37:43 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:27.250 10:37:43 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:27.250 10:37:43 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:27.250 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:27.250 10:37:43 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:27.250 10:37:43 -- common/autotest_common.sh@10 -- # set +x 00:06:27.250 [2024-07-13 10:37:43.454842] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:27.250 [2024-07-13 10:37:43.454936] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1975005 ] 00:06:27.250 EAL: No free 2048 kB hugepages reported on node 1 00:06:27.250 [2024-07-13 10:37:43.523000] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:27.250 [2024-07-13 10:37:43.523035] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:27.250 [2024-07-13 10:37:43.555711] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:27.250 [2024-07-13 10:37:43.555836] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.187 10:37:44 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:28.187 10:37:44 -- common/autotest_common.sh@852 -- # return 0 00:06:28.187 10:37:44 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=1975027 00:06:28.187 10:37:44 -- event/cpu_locks.sh@103 -- # waitforlisten 1975027 /var/tmp/spdk2.sock 00:06:28.187 10:37:44 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:28.187 10:37:44 -- common/autotest_common.sh@819 -- # '[' -z 1975027 ']' 00:06:28.187 10:37:44 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:28.187 10:37:44 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:28.187 10:37:44 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:28.187 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:06:28.187 10:37:44 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:28.187 10:37:44 -- common/autotest_common.sh@10 -- # set +x 00:06:28.187 [2024-07-13 10:37:44.283039] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:28.187 [2024-07-13 10:37:44.283125] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1975027 ] 00:06:28.187 EAL: No free 2048 kB hugepages reported on node 1 00:06:28.187 [2024-07-13 10:37:44.382564] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:28.187 [2024-07-13 10:37:44.455061] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:28.187 [2024-07-13 10:37:44.455177] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.755 10:37:45 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:28.755 10:37:45 -- common/autotest_common.sh@852 -- # return 0 00:06:28.755 10:37:45 -- event/cpu_locks.sh@105 -- # locks_exist 1975027 00:06:28.755 10:37:45 -- event/cpu_locks.sh@22 -- # lslocks -p 1975027 00:06:28.755 10:37:45 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:29.689 lslocks: write error 00:06:29.689 10:37:45 -- event/cpu_locks.sh@107 -- # killprocess 1975005 00:06:29.689 10:37:45 -- common/autotest_common.sh@926 -- # '[' -z 1975005 ']' 00:06:29.689 10:37:45 -- common/autotest_common.sh@930 -- # kill -0 1975005 00:06:29.689 10:37:45 -- common/autotest_common.sh@931 -- # uname 00:06:29.689 10:37:45 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:29.689 10:37:45 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1975005 00:06:29.689 10:37:45 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:29.689 10:37:45 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:29.689 10:37:45 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1975005' 00:06:29.689 killing process with pid 1975005 00:06:29.689 10:37:45 -- common/autotest_common.sh@945 -- # kill 1975005 00:06:29.689 10:37:45 -- common/autotest_common.sh@950 -- # wait 1975005 00:06:30.258 10:37:46 -- event/cpu_locks.sh@108 -- # killprocess 1975027 00:06:30.258 10:37:46 -- common/autotest_common.sh@926 -- # '[' -z 1975027 ']' 00:06:30.258 10:37:46 -- common/autotest_common.sh@930 -- # kill -0 1975027 00:06:30.258 10:37:46 -- common/autotest_common.sh@931 -- # uname 00:06:30.258 10:37:46 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:30.258 10:37:46 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1975027 00:06:30.258 10:37:46 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:30.258 10:37:46 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:30.258 10:37:46 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1975027' 00:06:30.258 killing process with pid 1975027 00:06:30.258 10:37:46 -- common/autotest_common.sh@945 -- # kill 1975027 00:06:30.258 10:37:46 -- common/autotest_common.sh@950 -- # wait 1975027 00:06:30.828 00:06:30.828 real 0m3.480s 00:06:30.828 user 0m3.675s 00:06:30.828 sys 0m1.181s 00:06:30.828 10:37:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:30.828 10:37:46 -- common/autotest_common.sh@10 -- # set +x 00:06:30.828 ************************************ 00:06:30.828 END TEST locking_app_on_unlocked_coremask 
00:06:30.828 ************************************ 00:06:30.828 10:37:46 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:06:30.828 10:37:46 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:30.828 10:37:46 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:30.828 10:37:46 -- common/autotest_common.sh@10 -- # set +x 00:06:30.828 ************************************ 00:06:30.828 START TEST locking_app_on_locked_coremask 00:06:30.828 ************************************ 00:06:30.828 10:37:46 -- common/autotest_common.sh@1104 -- # locking_app_on_locked_coremask 00:06:30.828 10:37:46 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=1975594 00:06:30.828 10:37:46 -- event/cpu_locks.sh@116 -- # waitforlisten 1975594 /var/tmp/spdk.sock 00:06:30.828 10:37:46 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:30.828 10:37:46 -- common/autotest_common.sh@819 -- # '[' -z 1975594 ']' 00:06:30.828 10:37:46 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:30.828 10:37:46 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:30.828 10:37:46 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:30.828 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:30.828 10:37:46 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:30.828 10:37:46 -- common/autotest_common.sh@10 -- # set +x 00:06:30.828 [2024-07-13 10:37:46.985684] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:30.828 [2024-07-13 10:37:46.985779] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1975594 ] 00:06:30.828 EAL: No free 2048 kB hugepages reported on node 1 00:06:30.828 [2024-07-13 10:37:47.054108] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:30.828 [2024-07-13 10:37:47.086647] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:30.828 [2024-07-13 10:37:47.086758] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.766 10:37:47 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:31.766 10:37:47 -- common/autotest_common.sh@852 -- # return 0 00:06:31.766 10:37:47 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=1975785 00:06:31.766 10:37:47 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 1975785 /var/tmp/spdk2.sock 00:06:31.766 10:37:47 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:31.766 10:37:47 -- common/autotest_common.sh@640 -- # local es=0 00:06:31.766 10:37:47 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 1975785 /var/tmp/spdk2.sock 00:06:31.766 10:37:47 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:06:31.766 10:37:47 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:31.766 10:37:47 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:06:31.766 10:37:47 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:31.766 10:37:47 -- common/autotest_common.sh@643 -- # waitforlisten 1975785 /var/tmp/spdk2.sock 00:06:31.766 10:37:47 -- 
common/autotest_common.sh@819 -- # '[' -z 1975785 ']' 00:06:31.766 10:37:47 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:31.766 10:37:47 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:31.766 10:37:47 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:31.766 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:31.766 10:37:47 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:31.766 10:37:47 -- common/autotest_common.sh@10 -- # set +x 00:06:31.766 [2024-07-13 10:37:47.816509] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:31.766 [2024-07-13 10:37:47.816598] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1975785 ] 00:06:31.766 EAL: No free 2048 kB hugepages reported on node 1 00:06:31.766 [2024-07-13 10:37:47.911083] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 1975594 has claimed it. 00:06:31.766 [2024-07-13 10:37:47.911124] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:32.334 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (1975785) - No such process 00:06:32.334 ERROR: process (pid: 1975785) is no longer running 00:06:32.334 10:37:48 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:32.334 10:37:48 -- common/autotest_common.sh@852 -- # return 1 00:06:32.334 10:37:48 -- common/autotest_common.sh@643 -- # es=1 00:06:32.334 10:37:48 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:32.334 10:37:48 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:32.334 10:37:48 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:32.334 10:37:48 -- event/cpu_locks.sh@122 -- # locks_exist 1975594 00:06:32.334 10:37:48 -- event/cpu_locks.sh@22 -- # lslocks -p 1975594 00:06:32.334 10:37:48 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:32.903 lslocks: write error 00:06:32.903 10:37:49 -- event/cpu_locks.sh@124 -- # killprocess 1975594 00:06:32.903 10:37:49 -- common/autotest_common.sh@926 -- # '[' -z 1975594 ']' 00:06:32.903 10:37:49 -- common/autotest_common.sh@930 -- # kill -0 1975594 00:06:32.903 10:37:49 -- common/autotest_common.sh@931 -- # uname 00:06:32.903 10:37:49 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:32.903 10:37:49 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1975594 00:06:32.903 10:37:49 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:32.903 10:37:49 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:32.903 10:37:49 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1975594' 00:06:32.903 killing process with pid 1975594 00:06:32.903 10:37:49 -- common/autotest_common.sh@945 -- # kill 1975594 00:06:32.903 10:37:49 -- common/autotest_common.sh@950 -- # wait 1975594 00:06:33.163 00:06:33.163 real 0m2.476s 00:06:33.163 user 0m2.680s 00:06:33.163 sys 0m0.814s 00:06:33.163 10:37:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:33.163 10:37:49 -- common/autotest_common.sh@10 -- # set +x 00:06:33.163 ************************************ 00:06:33.163 END TEST locking_app_on_locked_coremask 00:06:33.163 
************************************ 00:06:33.163 10:37:49 -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:06:33.163 10:37:49 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:33.163 10:37:49 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:33.163 10:37:49 -- common/autotest_common.sh@10 -- # set +x 00:06:33.163 ************************************ 00:06:33.163 START TEST locking_overlapped_coremask 00:06:33.163 ************************************ 00:06:33.163 10:37:49 -- common/autotest_common.sh@1104 -- # locking_overlapped_coremask 00:06:33.163 10:37:49 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=1976156 00:06:33.163 10:37:49 -- event/cpu_locks.sh@133 -- # waitforlisten 1976156 /var/tmp/spdk.sock 00:06:33.163 10:37:49 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:06:33.163 10:37:49 -- common/autotest_common.sh@819 -- # '[' -z 1976156 ']' 00:06:33.163 10:37:49 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:33.163 10:37:49 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:33.163 10:37:49 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:33.163 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:33.163 10:37:49 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:33.163 10:37:49 -- common/autotest_common.sh@10 -- # set +x 00:06:33.163 [2024-07-13 10:37:49.508741] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:33.163 [2024-07-13 10:37:49.508811] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1976156 ] 00:06:33.163 EAL: No free 2048 kB hugepages reported on node 1 00:06:33.423 [2024-07-13 10:37:49.576967] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:33.423 [2024-07-13 10:37:49.615708] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:33.423 [2024-07-13 10:37:49.615844] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:33.423 [2024-07-13 10:37:49.615940] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:33.423 [2024-07-13 10:37:49.615942] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.991 10:37:50 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:33.991 10:37:50 -- common/autotest_common.sh@852 -- # return 0 00:06:33.991 10:37:50 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=1976180 00:06:33.991 10:37:50 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 1976180 /var/tmp/spdk2.sock 00:06:33.991 10:37:50 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:06:33.991 10:37:50 -- common/autotest_common.sh@640 -- # local es=0 00:06:33.991 10:37:50 -- common/autotest_common.sh@642 -- # valid_exec_arg waitforlisten 1976180 /var/tmp/spdk2.sock 00:06:33.991 10:37:50 -- common/autotest_common.sh@628 -- # local arg=waitforlisten 00:06:33.991 10:37:50 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:33.991 10:37:50 -- common/autotest_common.sh@632 -- # type -t waitforlisten 00:06:33.991 10:37:50 -- common/autotest_common.sh@632 
-- # case "$(type -t "$arg")" in 00:06:33.991 10:37:50 -- common/autotest_common.sh@643 -- # waitforlisten 1976180 /var/tmp/spdk2.sock 00:06:33.991 10:37:50 -- common/autotest_common.sh@819 -- # '[' -z 1976180 ']' 00:06:33.991 10:37:50 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:33.991 10:37:50 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:33.991 10:37:50 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:33.991 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:33.991 10:37:50 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:33.991 10:37:50 -- common/autotest_common.sh@10 -- # set +x 00:06:33.991 [2024-07-13 10:37:50.356341] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:33.991 [2024-07-13 10:37:50.356403] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1976180 ] 00:06:34.250 EAL: No free 2048 kB hugepages reported on node 1 00:06:34.251 [2024-07-13 10:37:50.450898] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 1976156 has claimed it. 00:06:34.251 [2024-07-13 10:37:50.450938] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:34.819 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 834: kill: (1976180) - No such process 00:06:34.819 ERROR: process (pid: 1976180) is no longer running 00:06:34.819 10:37:51 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:34.819 10:37:51 -- common/autotest_common.sh@852 -- # return 1 00:06:34.819 10:37:51 -- common/autotest_common.sh@643 -- # es=1 00:06:34.819 10:37:51 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:34.819 10:37:51 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:34.819 10:37:51 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:34.819 10:37:51 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:06:34.819 10:37:51 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:34.819 10:37:51 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:34.819 10:37:51 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:34.819 10:37:51 -- event/cpu_locks.sh@141 -- # killprocess 1976156 00:06:34.819 10:37:51 -- common/autotest_common.sh@926 -- # '[' -z 1976156 ']' 00:06:34.819 10:37:51 -- common/autotest_common.sh@930 -- # kill -0 1976156 00:06:34.819 10:37:51 -- common/autotest_common.sh@931 -- # uname 00:06:34.819 10:37:51 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:34.819 10:37:51 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1976156 00:06:34.819 10:37:51 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:34.819 10:37:51 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:34.819 10:37:51 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1976156' 00:06:34.819 killing process with pid 1976156 00:06:34.819 10:37:51 -- 
common/autotest_common.sh@945 -- # kill 1976156 00:06:34.819 10:37:51 -- common/autotest_common.sh@950 -- # wait 1976156 00:06:35.077 00:06:35.077 real 0m1.871s 00:06:35.077 user 0m5.347s 00:06:35.077 sys 0m0.470s 00:06:35.077 10:37:51 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:35.077 10:37:51 -- common/autotest_common.sh@10 -- # set +x 00:06:35.077 ************************************ 00:06:35.077 END TEST locking_overlapped_coremask 00:06:35.077 ************************************ 00:06:35.077 10:37:51 -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:06:35.077 10:37:51 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:35.077 10:37:51 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:35.077 10:37:51 -- common/autotest_common.sh@10 -- # set +x 00:06:35.077 ************************************ 00:06:35.077 START TEST locking_overlapped_coremask_via_rpc 00:06:35.077 ************************************ 00:06:35.077 10:37:51 -- common/autotest_common.sh@1104 -- # locking_overlapped_coremask_via_rpc 00:06:35.077 10:37:51 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=1976468 00:06:35.077 10:37:51 -- event/cpu_locks.sh@149 -- # waitforlisten 1976468 /var/tmp/spdk.sock 00:06:35.077 10:37:51 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:06:35.077 10:37:51 -- common/autotest_common.sh@819 -- # '[' -z 1976468 ']' 00:06:35.077 10:37:51 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:35.077 10:37:51 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:35.077 10:37:51 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:35.077 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:35.077 10:37:51 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:35.077 10:37:51 -- common/autotest_common.sh@10 -- # set +x 00:06:35.077 [2024-07-13 10:37:51.429205] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:35.077 [2024-07-13 10:37:51.429276] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1976468 ] 00:06:35.077 EAL: No free 2048 kB hugepages reported on node 1 00:06:35.336 [2024-07-13 10:37:51.497281] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
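Both overlapped-coremask tests turn on mask arithmetic: -m 0x7 is binary 00111, i.e. cores 0-2, and -m 0x1c is 11100, i.e. cores 2-4, so the two masks collide exactly on core 2 — the core named in the 'Cannot create lock on core 2' error above. The overlap is one bitwise AND away:

    # 0x7 = 0b00111 -> cores 0,1,2 ; 0x1c = 0b11100 -> cores 2,3,4
    printf 'overlap mask: 0x%x\n' $(( 0x7 & 0x1c ))   # prints 0x4, i.e. core 2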
00:06:35.336 [2024-07-13 10:37:51.497307] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:35.336 [2024-07-13 10:37:51.536040] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:35.336 [2024-07-13 10:37:51.536174] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:35.336 [2024-07-13 10:37:51.536268] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:35.336 [2024-07-13 10:37:51.536270] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:35.903 10:37:52 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:35.903 10:37:52 -- common/autotest_common.sh@852 -- # return 0 00:06:35.903 10:37:52 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=1976587 00:06:35.903 10:37:52 -- event/cpu_locks.sh@153 -- # waitforlisten 1976587 /var/tmp/spdk2.sock 00:06:35.903 10:37:52 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:06:35.903 10:37:52 -- common/autotest_common.sh@819 -- # '[' -z 1976587 ']' 00:06:35.903 10:37:52 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:35.903 10:37:52 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:35.903 10:37:52 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:35.903 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:35.903 10:37:52 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:35.903 10:37:52 -- common/autotest_common.sh@10 -- # set +x 00:06:35.903 [2024-07-13 10:37:52.282348] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:35.903 [2024-07-13 10:37:52.282416] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1976587 ] 00:06:36.162 EAL: No free 2048 kB hugepages reported on node 1 00:06:36.162 [2024-07-13 10:37:52.376422] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:06:36.162 [2024-07-13 10:37:52.376456] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:36.162 [2024-07-13 10:37:52.455392] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:36.162 [2024-07-13 10:37:52.455551] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:36.162 [2024-07-13 10:37:52.459489] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:36.162 [2024-07-13 10:37:52.459491] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:06:36.728 10:37:53 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:36.728 10:37:53 -- common/autotest_common.sh@852 -- # return 0 00:06:36.728 10:37:53 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:06:36.728 10:37:53 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:36.728 10:37:53 -- common/autotest_common.sh@10 -- # set +x 00:06:36.985 10:37:53 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:36.985 10:37:53 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:36.985 10:37:53 -- common/autotest_common.sh@640 -- # local es=0 00:06:36.985 10:37:53 -- common/autotest_common.sh@642 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:36.985 10:37:53 -- common/autotest_common.sh@628 -- # local arg=rpc_cmd 00:06:36.985 10:37:53 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:36.985 10:37:53 -- common/autotest_common.sh@632 -- # type -t rpc_cmd 00:06:36.985 10:37:53 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:36.985 10:37:53 -- common/autotest_common.sh@643 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:36.985 10:37:53 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:36.985 10:37:53 -- common/autotest_common.sh@10 -- # set +x 00:06:36.985 [2024-07-13 10:37:53.129502] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 1976468 has claimed it. 00:06:36.985 request: 00:06:36.985 { 00:06:36.985 "method": "framework_enable_cpumask_locks", 00:06:36.985 "req_id": 1 00:06:36.985 } 00:06:36.985 Got JSON-RPC error response 00:06:36.985 response: 00:06:36.985 { 00:06:36.985 "code": -32603, 00:06:36.985 "message": "Failed to claim CPU core: 2" 00:06:36.985 } 00:06:36.985 10:37:53 -- common/autotest_common.sh@579 -- # [[ 1 == 0 ]] 00:06:36.985 10:37:53 -- common/autotest_common.sh@643 -- # es=1 00:06:36.985 10:37:53 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:36.985 10:37:53 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:36.985 10:37:53 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:36.985 10:37:53 -- event/cpu_locks.sh@158 -- # waitforlisten 1976468 /var/tmp/spdk.sock 00:06:36.985 10:37:53 -- common/autotest_common.sh@819 -- # '[' -z 1976468 ']' 00:06:36.985 10:37:53 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:36.985 10:37:53 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:36.985 10:37:53 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:36.985 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
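The via_rpc variant defers locking to runtime: both targets boot with --disable-cpumask-locks, the first then claims cores 0-2 over JSON-RPC, and the same call against the second target (mask 0x1c, overlapping on core 2) is answered with the -32603 'Failed to claim CPU core: 2' error shown above. Assuming rpc_cmd is the usual thin wrapper over scripts/rpc.py, the two calls look roughly like:

    # first target (default socket /var/tmp/spdk.sock): claim succeeds
    ./scripts/rpc.py framework_enable_cpumask_locks
    # second target: core 2 is already claimed, so the server returns
    # JSON-RPC error -32603 "Failed to claim CPU core: 2"
    ./scripts/rpc.py -s /var/tmp/spdk2.sock framework_enable_cpumask_locks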
00:06:36.985 10:37:53 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:36.985 10:37:53 -- common/autotest_common.sh@10 -- # set +x 00:06:36.985 10:37:53 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:36.985 10:37:53 -- common/autotest_common.sh@852 -- # return 0 00:06:36.985 10:37:53 -- event/cpu_locks.sh@159 -- # waitforlisten 1976587 /var/tmp/spdk2.sock 00:06:36.985 10:37:53 -- common/autotest_common.sh@819 -- # '[' -z 1976587 ']' 00:06:36.985 10:37:53 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:36.985 10:37:53 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:36.985 10:37:53 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:36.985 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:36.985 10:37:53 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:36.985 10:37:53 -- common/autotest_common.sh@10 -- # set +x 00:06:37.244 10:37:53 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:37.244 10:37:53 -- common/autotest_common.sh@852 -- # return 0 00:06:37.244 10:37:53 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:06:37.244 10:37:53 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:37.244 10:37:53 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:37.244 10:37:53 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:37.244 00:06:37.244 real 0m2.091s 00:06:37.244 user 0m0.830s 00:06:37.244 sys 0m0.191s 00:06:37.244 10:37:53 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:37.244 10:37:53 -- common/autotest_common.sh@10 -- # set +x 00:06:37.244 ************************************ 00:06:37.244 END TEST locking_overlapped_coremask_via_rpc 00:06:37.244 ************************************ 00:06:37.244 10:37:53 -- event/cpu_locks.sh@174 -- # cleanup 00:06:37.244 10:37:53 -- event/cpu_locks.sh@15 -- # [[ -z 1976468 ]] 00:06:37.244 10:37:53 -- event/cpu_locks.sh@15 -- # killprocess 1976468 00:06:37.244 10:37:53 -- common/autotest_common.sh@926 -- # '[' -z 1976468 ']' 00:06:37.244 10:37:53 -- common/autotest_common.sh@930 -- # kill -0 1976468 00:06:37.244 10:37:53 -- common/autotest_common.sh@931 -- # uname 00:06:37.244 10:37:53 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:37.244 10:37:53 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1976468 00:06:37.244 10:37:53 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:37.244 10:37:53 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:37.244 10:37:53 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1976468' 00:06:37.244 killing process with pid 1976468 00:06:37.244 10:37:53 -- common/autotest_common.sh@945 -- # kill 1976468 00:06:37.244 10:37:53 -- common/autotest_common.sh@950 -- # wait 1976468 00:06:37.502 10:37:53 -- event/cpu_locks.sh@16 -- # [[ -z 1976587 ]] 00:06:37.502 10:37:53 -- event/cpu_locks.sh@16 -- # killprocess 1976587 00:06:37.502 10:37:53 -- common/autotest_common.sh@926 -- # '[' -z 1976587 ']' 00:06:37.502 10:37:53 -- common/autotest_common.sh@930 -- # kill -0 1976587 00:06:37.502 10:37:53 -- common/autotest_common.sh@931 -- # uname 
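check_remaining_locks, traced in both overlapped tests, verifies that exactly the expected per-core lock files exist: it globs /var/tmp/spdk_cpu_lock_* and string-compares the expansion against a brace range covering the claimed cores. Reduced to its essentials:

    # expect lock files for cores 0-2 and nothing else
    locks=(/var/tmp/spdk_cpu_lock_*)
    locks_expected=(/var/tmp/spdk_cpu_lock_{000..002})
    [[ ${locks[*]} == "${locks_expected[*]}" ]] && echo 'locks match'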
00:06:37.761 10:37:53 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:37.761 10:37:53 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1976587 00:06:37.761 10:37:53 -- common/autotest_common.sh@932 -- # process_name=reactor_2 00:06:37.761 10:37:53 -- common/autotest_common.sh@936 -- # '[' reactor_2 = sudo ']' 00:06:37.761 10:37:53 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1976587' 00:06:37.761 killing process with pid 1976587 00:06:37.761 10:37:53 -- common/autotest_common.sh@945 -- # kill 1976587 00:06:37.761 10:37:53 -- common/autotest_common.sh@950 -- # wait 1976587 00:06:38.020 10:37:54 -- event/cpu_locks.sh@18 -- # rm -f 00:06:38.020 10:37:54 -- event/cpu_locks.sh@1 -- # cleanup 00:06:38.020 10:37:54 -- event/cpu_locks.sh@15 -- # [[ -z 1976468 ]] 00:06:38.020 10:37:54 -- event/cpu_locks.sh@15 -- # killprocess 1976468 00:06:38.020 10:37:54 -- common/autotest_common.sh@926 -- # '[' -z 1976468 ']' 00:06:38.020 10:37:54 -- common/autotest_common.sh@930 -- # kill -0 1976468 00:06:38.020 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 930: kill: (1976468) - No such process 00:06:38.020 10:37:54 -- common/autotest_common.sh@953 -- # echo 'Process with pid 1976468 is not found' 00:06:38.020 Process with pid 1976468 is not found 00:06:38.020 10:37:54 -- event/cpu_locks.sh@16 -- # [[ -z 1976587 ]] 00:06:38.020 10:37:54 -- event/cpu_locks.sh@16 -- # killprocess 1976587 00:06:38.020 10:37:54 -- common/autotest_common.sh@926 -- # '[' -z 1976587 ']' 00:06:38.020 10:37:54 -- common/autotest_common.sh@930 -- # kill -0 1976587 00:06:38.020 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 930: kill: (1976587) - No such process 00:06:38.020 10:37:54 -- common/autotest_common.sh@953 -- # echo 'Process with pid 1976587 is not found' 00:06:38.020 Process with pid 1976587 is not found 00:06:38.020 10:37:54 -- event/cpu_locks.sh@18 -- # rm -f 00:06:38.020 00:06:38.020 real 0m18.258s 00:06:38.020 user 0m30.684s 00:06:38.020 sys 0m5.973s 00:06:38.020 10:37:54 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:38.020 10:37:54 -- common/autotest_common.sh@10 -- # set +x 00:06:38.020 ************************************ 00:06:38.020 END TEST cpu_locks 00:06:38.020 ************************************ 00:06:38.020 00:06:38.020 real 0m42.321s 00:06:38.020 user 1m17.900s 00:06:38.020 sys 0m9.931s 00:06:38.020 10:37:54 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:38.020 10:37:54 -- common/autotest_common.sh@10 -- # set +x 00:06:38.020 ************************************ 00:06:38.020 END TEST event 00:06:38.020 ************************************ 00:06:38.020 10:37:54 -- spdk/autotest.sh@188 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:38.020 10:37:54 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:38.020 10:37:54 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:38.020 10:37:54 -- common/autotest_common.sh@10 -- # set +x 00:06:38.020 ************************************ 00:06:38.020 START TEST thread 00:06:38.020 ************************************ 00:06:38.020 10:37:54 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:38.279 * Looking for test storage... 
00:06:38.279 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:06:38.279 10:37:54 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:38.279 10:37:54 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:06:38.279 10:37:54 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:38.279 10:37:54 -- common/autotest_common.sh@10 -- # set +x 00:06:38.279 ************************************ 00:06:38.279 START TEST thread_poller_perf 00:06:38.279 ************************************ 00:06:38.279 10:37:54 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:38.279 [2024-07-13 10:37:54.454229] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:38.279 [2024-07-13 10:37:54.454329] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1977113 ] 00:06:38.279 EAL: No free 2048 kB hugepages reported on node 1 00:06:38.280 [2024-07-13 10:37:54.524334] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:38.280 [2024-07-13 10:37:54.560670] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:38.280 Running 1000 pollers for 1 seconds with 1 microseconds period. 00:06:39.658 ====================================== 00:06:39.658 busy:2504640208 (cyc) 00:06:39.658 total_run_count: 800000 00:06:39.658 tsc_hz: 2500000000 (cyc) 00:06:39.658 ====================================== 00:06:39.658 poller_cost: 3130 (cyc), 1252 (nsec) 00:06:39.658 00:06:39.658 real 0m1.183s 00:06:39.658 user 0m1.085s 00:06:39.658 sys 0m0.094s 00:06:39.658 10:37:55 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:39.658 10:37:55 -- common/autotest_common.sh@10 -- # set +x 00:06:39.658 ************************************ 00:06:39.658 END TEST thread_poller_perf 00:06:39.658 ************************************ 00:06:39.658 10:37:55 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:39.658 10:37:55 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:06:39.658 10:37:55 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:39.658 10:37:55 -- common/autotest_common.sh@10 -- # set +x 00:06:39.658 ************************************ 00:06:39.658 START TEST thread_poller_perf 00:06:39.658 ************************************ 00:06:39.658 10:37:55 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:39.658 [2024-07-13 10:37:55.678922] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
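The poller_cost figures above follow directly from the counters printed with them: 2,504,640,208 busy cycles over 800,000 runs is 3130 cycles per poller invocation, and at the reported 2.5 GHz timestamp-counter rate that is about 1252 ns — exactly the '3130 (cyc), 1252 (nsec)' line. The same integer arithmetic in shell:

    echo "$(( 2504640208 / 800000 )) cyc"                               # 3130
    echo "$(( 2504640208 * 1000000000 / 800000 / 2500000000 )) nsec"    # 1252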
00:06:39.658 [2024-07-13 10:37:55.679029] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1977395 ] 00:06:39.658 EAL: No free 2048 kB hugepages reported on node 1 00:06:39.658 [2024-07-13 10:37:55.749362] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:39.658 [2024-07-13 10:37:55.783904] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.658 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:06:40.595 ====================================== 00:06:40.595 busy:2501795806 (cyc) 00:06:40.595 total_run_count: 14022000 00:06:40.595 tsc_hz: 2500000000 (cyc) 00:06:40.595 ====================================== 00:06:40.595 poller_cost: 178 (cyc), 71 (nsec) 00:06:40.595 00:06:40.595 real 0m1.174s 00:06:40.595 user 0m1.079s 00:06:40.595 sys 0m0.092s 00:06:40.595 10:37:56 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:40.595 10:37:56 -- common/autotest_common.sh@10 -- # set +x 00:06:40.595 ************************************ 00:06:40.595 END TEST thread_poller_perf 00:06:40.595 ************************************ 00:06:40.595 10:37:56 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:06:40.595 10:37:56 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:40.595 10:37:56 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:06:40.595 10:37:56 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:40.595 10:37:56 -- common/autotest_common.sh@10 -- # set +x 00:06:40.595 ************************************ 00:06:40.595 START TEST thread_spdk_lock 00:06:40.595 ************************************ 00:06:40.595 10:37:56 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:40.595 [2024-07-13 10:37:56.894735] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
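This second run swaps the 1-microsecond period for 0 (-b 1000 -l 0 -t 1), making all 1000 pollers untimed: 2,501,795,806 cycles over 14,022,000 runs is 178 cycles, about 71 ns per invocation — roughly 17x cheaper than the timed case above. A reasonable reading, though the log itself does not break it down, is that the ~3130-cycle timed figure is dominated by timer bookkeeping while 178 cycles approximates the bare poll-loop overhead:

    echo "$(( 2501795806 / 14022000 )) cyc"                             # 178
    echo "$(( 2501795806 * 1000000000 / 14022000 / 2500000000 )) nsec"  # 71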
00:06:40.595 [2024-07-13 10:37:56.894857] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1977552 ] 00:06:40.595 EAL: No free 2048 kB hugepages reported on node 1 00:06:40.595 [2024-07-13 10:37:56.965408] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:40.853 [2024-07-13 10:37:57.001084] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:40.853 [2024-07-13 10:37:57.001086] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:41.112 [2024-07-13 10:37:57.496435] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 955:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:41.112 [2024-07-13 10:37:57.496479] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3062:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:06:41.112 [2024-07-13 10:37:57.496490] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3017:sspin_stacks_print: *ERROR*: spinlock 0x133de80 00:06:41.112 [2024-07-13 10:37:57.497279] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:41.112 [2024-07-13 10:37:57.497383] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1016:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:41.112 [2024-07-13 10:37:57.497403] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:41.371 Starting test contend 00:06:41.371 Worker Delay Wait us Hold us Total us 00:06:41.371 0 3 170404 190354 360759 00:06:41.371 1 5 83703 291795 375499 00:06:41.371 PASS test contend 00:06:41.371 Starting test hold_by_poller 00:06:41.371 PASS test hold_by_poller 00:06:41.371 Starting test hold_by_message 00:06:41.371 PASS test hold_by_message 00:06:41.371 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:06:41.371 100014 assertions passed 00:06:41.371 0 assertions failed 00:06:41.371 00:06:41.371 real 0m0.671s 00:06:41.371 user 0m1.078s 00:06:41.371 sys 0m0.085s 00:06:41.371 10:37:57 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:41.371 10:37:57 -- common/autotest_common.sh@10 -- # set +x 00:06:41.371 ************************************ 00:06:41.371 END TEST thread_spdk_lock 00:06:41.371 ************************************ 00:06:41.371 00:06:41.371 real 0m3.253s 00:06:41.371 user 0m3.331s 00:06:41.371 sys 0m0.432s 00:06:41.371 10:37:57 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:41.371 10:37:57 -- common/autotest_common.sh@10 -- # set +x 00:06:41.371 ************************************ 00:06:41.371 END TEST thread 00:06:41.371 ************************************ 00:06:41.371 10:37:57 -- spdk/autotest.sh@189 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:41.371 10:37:57 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 
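Note: the unrecoverable spinlock *ERROR* lines above appear to be emitted while spdk_lock deliberately exercises misuse paths (deadlock detection, a lock held while an SPDK thread goes off CPU); the run still finishes with 100014 assertions passed and 0 failed. In the contend table, Total us is simply Wait us plus Hold us per worker, which checks out to within rounding:

    echo $(( 170404 + 190354 ))   # 360758, vs 360759 reported for worker 0
    echo $(( 83703 + 291795 ))    # 375498, vs 375499 reported for worker 1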
00:06:41.371 10:37:57 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:41.371 10:37:57 -- common/autotest_common.sh@10 -- # set +x 00:06:41.371 ************************************ 00:06:41.371 START TEST accel 00:06:41.371 ************************************ 00:06:41.371 10:37:57 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:41.371 * Looking for test storage... 00:06:41.371 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:06:41.371 10:37:57 -- accel/accel.sh@73 -- # declare -A expected_opcs 00:06:41.371 10:37:57 -- accel/accel.sh@74 -- # get_expected_opcs 00:06:41.371 10:37:57 -- accel/accel.sh@57 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:41.371 10:37:57 -- accel/accel.sh@59 -- # spdk_tgt_pid=1977752 00:06:41.371 10:37:57 -- accel/accel.sh@60 -- # waitforlisten 1977752 00:06:41.371 10:37:57 -- common/autotest_common.sh@819 -- # '[' -z 1977752 ']' 00:06:41.371 10:37:57 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:41.371 10:37:57 -- accel/accel.sh@58 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:06:41.371 10:37:57 -- common/autotest_common.sh@824 -- # local max_retries=100 00:06:41.371 10:37:57 -- accel/accel.sh@58 -- # build_accel_config 00:06:41.371 10:37:57 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:41.371 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:41.371 10:37:57 -- common/autotest_common.sh@828 -- # xtrace_disable 00:06:41.371 10:37:57 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:41.371 10:37:57 -- common/autotest_common.sh@10 -- # set +x 00:06:41.371 10:37:57 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:41.371 10:37:57 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:41.371 10:37:57 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:41.371 10:37:57 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:41.371 10:37:57 -- accel/accel.sh@41 -- # local IFS=, 00:06:41.371 10:37:57 -- accel/accel.sh@42 -- # jq -r . 00:06:41.630 [2024-07-13 10:37:57.760672] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:41.630 [2024-07-13 10:37:57.760759] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1977752 ] 00:06:41.630 EAL: No free 2048 kB hugepages reported on node 1 00:06:41.630 [2024-07-13 10:37:57.830719] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:41.630 [2024-07-13 10:37:57.867081] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:41.630 [2024-07-13 10:37:57.867184] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.198 10:37:58 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:06:42.198 10:37:58 -- common/autotest_common.sh@852 -- # return 0 00:06:42.198 10:37:58 -- accel/accel.sh@62 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:06:42.198 10:37:58 -- accel/accel.sh@62 -- # rpc_cmd accel_get_opc_assignments 00:06:42.198 10:37:58 -- accel/accel.sh@62 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:06:42.198 10:37:58 -- common/autotest_common.sh@551 -- # xtrace_disable 00:06:42.198 10:37:58 -- common/autotest_common.sh@10 -- # set +x 00:06:42.198 10:37:58 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:06:42.457 10:37:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:42.457 10:37:58 -- accel/accel.sh@64 -- # IFS== 00:06:42.457 10:37:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:42.457 10:37:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:42.457 10:37:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:42.457 10:37:58 -- accel/accel.sh@64 -- # IFS== 00:06:42.457 10:37:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:42.457 10:37:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:42.457 10:37:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:42.457 10:37:58 -- accel/accel.sh@64 -- # IFS== 00:06:42.457 10:37:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:42.457 10:37:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:42.457 10:37:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:42.457 10:37:58 -- accel/accel.sh@64 -- # IFS== 00:06:42.458 10:37:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:42.458 10:37:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:42.458 10:37:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:42.458 10:37:58 -- accel/accel.sh@64 -- # IFS== 00:06:42.458 10:37:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:42.458 10:37:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:42.458 10:37:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:42.458 10:37:58 -- accel/accel.sh@64 -- # IFS== 00:06:42.458 10:37:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:42.458 10:37:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:42.458 10:37:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:42.458 10:37:58 -- accel/accel.sh@64 -- # IFS== 00:06:42.458 10:37:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:42.458 10:37:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:42.458 10:37:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:42.458 10:37:58 -- accel/accel.sh@64 -- # IFS== 00:06:42.458 10:37:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:42.458 10:37:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:42.458 10:37:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:42.458 10:37:58 -- accel/accel.sh@64 -- # IFS== 00:06:42.458 10:37:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:42.458 10:37:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:42.458 10:37:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:42.458 10:37:58 -- accel/accel.sh@64 -- # IFS== 00:06:42.458 10:37:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:42.458 10:37:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:42.458 10:37:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:42.458 10:37:58 -- accel/accel.sh@64 -- # IFS== 00:06:42.458 10:37:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:42.458 10:37:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:42.458 10:37:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:42.458 10:37:58 -- accel/accel.sh@64 -- # IFS== 00:06:42.458 10:37:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:42.458 
10:37:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:42.458 10:37:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:42.458 10:37:58 -- accel/accel.sh@64 -- # IFS== 00:06:42.458 10:37:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:42.458 10:37:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:42.458 10:37:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:42.458 10:37:58 -- accel/accel.sh@64 -- # IFS== 00:06:42.458 10:37:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:42.458 10:37:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:42.458 10:37:58 -- accel/accel.sh@67 -- # killprocess 1977752 00:06:42.458 10:37:58 -- common/autotest_common.sh@926 -- # '[' -z 1977752 ']' 00:06:42.458 10:37:58 -- common/autotest_common.sh@930 -- # kill -0 1977752 00:06:42.458 10:37:58 -- common/autotest_common.sh@931 -- # uname 00:06:42.458 10:37:58 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:06:42.458 10:37:58 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1977752 00:06:42.458 10:37:58 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:06:42.458 10:37:58 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:06:42.458 10:37:58 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1977752' 00:06:42.458 killing process with pid 1977752 00:06:42.458 10:37:58 -- common/autotest_common.sh@945 -- # kill 1977752 00:06:42.458 10:37:58 -- common/autotest_common.sh@950 -- # wait 1977752 00:06:42.717 10:37:58 -- accel/accel.sh@68 -- # trap - ERR 00:06:42.717 10:37:58 -- accel/accel.sh@81 -- # run_test accel_help accel_perf -h 00:06:42.717 10:37:58 -- common/autotest_common.sh@1077 -- # '[' 3 -le 1 ']' 00:06:42.717 10:37:58 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:42.717 10:37:58 -- common/autotest_common.sh@10 -- # set +x 00:06:42.717 10:37:58 -- common/autotest_common.sh@1104 -- # accel_perf -h 00:06:42.717 10:37:58 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:06:42.717 10:37:58 -- accel/accel.sh@12 -- # build_accel_config 00:06:42.717 10:37:58 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:42.717 10:37:58 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:42.717 10:37:58 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:42.717 10:37:58 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:42.717 10:37:58 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:42.717 10:37:58 -- accel/accel.sh@41 -- # local IFS=, 00:06:42.717 10:37:58 -- accel/accel.sh@42 -- # jq -r . 
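Note: two helpers carry this setup. rpc_cmd accel_get_opc_assignments asks the freshly started spdk_tgt which module handles each accel opcode, and the jq filter flattens the JSON map into opc=module lines that feed the IFS== read loop above; killprocess then checks the pid's comm before killing it. Rough standalone equivalents, assuming an SPDK checkout and the default RPC socket /var/tmp/spdk.sock; the real helpers in autotest_common.sh do more than this sketch:

    ./scripts/rpc.py accel_get_opc_assignments \
        | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'   # e.g. copy=software
    pid=1977752                                    # sketch of killprocess
    [ "$(ps --no-headers -o comm= "$pid")" != sudo ] && kill "$pid"
    wait "$pid"                                    # only valid if $pid is a child of this shell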
00:06:42.717 10:37:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:42.717 10:37:58 -- common/autotest_common.sh@10 -- # set +x 00:06:42.717 10:37:59 -- accel/accel.sh@83 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:06:42.717 10:37:59 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:42.717 10:37:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:42.717 10:37:59 -- common/autotest_common.sh@10 -- # set +x 00:06:42.717 ************************************ 00:06:42.717 START TEST accel_missing_filename 00:06:42.717 ************************************ 00:06:42.717 10:37:59 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w compress 00:06:42.717 10:37:59 -- common/autotest_common.sh@640 -- # local es=0 00:06:42.717 10:37:59 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w compress 00:06:42.717 10:37:59 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:42.717 10:37:59 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:42.717 10:37:59 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:42.717 10:37:59 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:42.717 10:37:59 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w compress 00:06:42.717 10:37:59 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:06:42.717 10:37:59 -- accel/accel.sh@12 -- # build_accel_config 00:06:42.717 10:37:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:42.717 10:37:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:42.717 10:37:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:42.717 10:37:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:42.717 10:37:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:42.717 10:37:59 -- accel/accel.sh@41 -- # local IFS=, 00:06:42.717 10:37:59 -- accel/accel.sh@42 -- # jq -r . 00:06:42.717 [2024-07-13 10:37:59.064394] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:42.717 [2024-07-13 10:37:59.064489] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1978056 ] 00:06:42.717 EAL: No free 2048 kB hugepages reported on node 1 00:06:42.977 [2024-07-13 10:37:59.133614] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:42.977 [2024-07-13 10:37:59.168665] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.977 [2024-07-13 10:37:59.207251] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:42.977 [2024-07-13 10:37:59.267182] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:42.977 A filename is required. 
00:06:42.977 10:37:59 -- common/autotest_common.sh@643 -- # es=234 00:06:42.977 10:37:59 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:42.977 10:37:59 -- common/autotest_common.sh@652 -- # es=106 00:06:42.977 10:37:59 -- common/autotest_common.sh@653 -- # case "$es" in 00:06:42.977 10:37:59 -- common/autotest_common.sh@660 -- # es=1 00:06:42.977 10:37:59 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:42.977 00:06:42.977 real 0m0.283s 00:06:42.977 user 0m0.192s 00:06:42.977 sys 0m0.128s 00:06:42.977 10:37:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:42.977 10:37:59 -- common/autotest_common.sh@10 -- # set +x 00:06:42.977 ************************************ 00:06:42.977 END TEST accel_missing_filename 00:06:42.977 ************************************ 00:06:43.237 10:37:59 -- accel/accel.sh@85 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:43.237 10:37:59 -- common/autotest_common.sh@1077 -- # '[' 10 -le 1 ']' 00:06:43.237 10:37:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:43.237 10:37:59 -- common/autotest_common.sh@10 -- # set +x 00:06:43.237 ************************************ 00:06:43.237 START TEST accel_compress_verify 00:06:43.237 ************************************ 00:06:43.237 10:37:59 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:43.237 10:37:59 -- common/autotest_common.sh@640 -- # local es=0 00:06:43.237 10:37:59 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:43.237 10:37:59 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:43.237 10:37:59 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:43.237 10:37:59 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:43.237 10:37:59 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:43.237 10:37:59 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:43.237 10:37:59 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:43.237 10:37:59 -- accel/accel.sh@12 -- # build_accel_config 00:06:43.237 10:37:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:43.237 10:37:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:43.237 10:37:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:43.237 10:37:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:43.237 10:37:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:43.237 10:37:59 -- accel/accel.sh@41 -- # local IFS=, 00:06:43.237 10:37:59 -- accel/accel.sh@42 -- # jq -r . 00:06:43.237 [2024-07-13 10:37:59.396410] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
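Note: accel_missing_filename is a negative test: the compress workload needs an input file via -l, so run_test wraps the command in NOT, and the test passes only because accel_perf exits non-zero with 'A filename is required.'. A minimal sketch of the pattern; this not() helper is ours, not the NOT defined in autotest_common.sh:

    not() { if "$@"; then return 1; else return 0; fi; }   # invert the exit status
    not ./build/examples/accel_perf -t 1 -w compress && echo "failed as expected"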
00:06:43.237 [2024-07-13 10:37:59.396511] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1978078 ] 00:06:43.237 EAL: No free 2048 kB hugepages reported on node 1 00:06:43.237 [2024-07-13 10:37:59.468269] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:43.237 [2024-07-13 10:37:59.504127] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:43.237 [2024-07-13 10:37:59.543651] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:43.237 [2024-07-13 10:37:59.603276] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:43.497 00:06:43.497 Compression does not support the verify option, aborting. 00:06:43.497 10:37:59 -- common/autotest_common.sh@643 -- # es=161 00:06:43.497 10:37:59 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:43.497 10:37:59 -- common/autotest_common.sh@652 -- # es=33 00:06:43.497 10:37:59 -- common/autotest_common.sh@653 -- # case "$es" in 00:06:43.497 10:37:59 -- common/autotest_common.sh@660 -- # es=1 00:06:43.497 10:37:59 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:43.497 00:06:43.497 real 0m0.288s 00:06:43.497 user 0m0.193s 00:06:43.497 sys 0m0.134s 00:06:43.497 10:37:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:43.497 10:37:59 -- common/autotest_common.sh@10 -- # set +x 00:06:43.497 ************************************ 00:06:43.497 END TEST accel_compress_verify 00:06:43.497 ************************************ 00:06:43.497 10:37:59 -- accel/accel.sh@87 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:06:43.497 10:37:59 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:43.497 10:37:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:43.497 10:37:59 -- common/autotest_common.sh@10 -- # set +x 00:06:43.497 ************************************ 00:06:43.497 START TEST accel_wrong_workload 00:06:43.497 ************************************ 00:06:43.497 10:37:59 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w foobar 00:06:43.497 10:37:59 -- common/autotest_common.sh@640 -- # local es=0 00:06:43.497 10:37:59 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:06:43.497 10:37:59 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:43.497 10:37:59 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:43.497 10:37:59 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:43.497 10:37:59 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:43.497 10:37:59 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w foobar 00:06:43.497 10:37:59 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:06:43.497 10:37:59 -- accel/accel.sh@12 -- # build_accel_config 00:06:43.497 10:37:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:43.497 10:37:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:43.497 10:37:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:43.497 10:37:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:43.497 10:37:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:43.497 10:37:59 -- accel/accel.sh@41 -- # local IFS=, 00:06:43.497 10:37:59 -- accel/accel.sh@42 -- # jq -r . 
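Note: the es= traces above show how the harness normalizes exit codes before asserting failure: a code above 128 has 128 subtracted (234 became 106 for the missing-filename run, 161 becomes 33 here), and any remaining non-zero value collapses to es=1, which is what the NOT wrapper checks:

    echo $(( 234 - 128 )) $(( 161 - 128 ))   # 106 33, matching the es= lines above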
00:06:43.497 Unsupported workload type: foobar 00:06:43.497 [2024-07-13 10:37:59.729744] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:06:43.497 accel_perf options: 00:06:43.497 [-h help message] 00:06:43.497 [-q queue depth per core] 00:06:43.497 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:43.497 [-T number of threads per core 00:06:43.497 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:43.497 [-t time in seconds] 00:06:43.497 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:43.497 [ dif_verify, , dif_generate, dif_generate_copy 00:06:43.497 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:43.497 [-l for compress/decompress workloads, name of uncompressed input file 00:06:43.497 [-S for crc32c workload, use this seed value (default 0) 00:06:43.497 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:43.497 [-f for fill workload, use this BYTE value (default 255) 00:06:43.497 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:43.497 [-y verify result if this switch is on] 00:06:43.497 [-a tasks to allocate per core (default: same value as -q)] 00:06:43.497 Can be used to spread operations across a wider range of memory. 00:06:43.497 10:37:59 -- common/autotest_common.sh@643 -- # es=1 00:06:43.497 10:37:59 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:43.497 10:37:59 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:43.497 10:37:59 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:43.497 00:06:43.497 real 0m0.028s 00:06:43.497 user 0m0.011s 00:06:43.497 sys 0m0.017s 00:06:43.497 10:37:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:43.497 10:37:59 -- common/autotest_common.sh@10 -- # set +x 00:06:43.497 ************************************ 00:06:43.497 END TEST accel_wrong_workload 00:06:43.497 ************************************ 00:06:43.497 Error: writing output failed: Broken pipe 00:06:43.497 10:37:59 -- accel/accel.sh@89 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:06:43.497 10:37:59 -- common/autotest_common.sh@1077 -- # '[' 10 -le 1 ']' 00:06:43.497 10:37:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:43.497 10:37:59 -- common/autotest_common.sh@10 -- # set +x 00:06:43.497 ************************************ 00:06:43.497 START TEST accel_negative_buffers 00:06:43.498 ************************************ 00:06:43.498 10:37:59 -- common/autotest_common.sh@1104 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:06:43.498 10:37:59 -- common/autotest_common.sh@640 -- # local es=0 00:06:43.498 10:37:59 -- common/autotest_common.sh@642 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:06:43.498 10:37:59 -- common/autotest_common.sh@628 -- # local arg=accel_perf 00:06:43.498 10:37:59 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:43.498 10:37:59 -- common/autotest_common.sh@632 -- # type -t accel_perf 00:06:43.498 10:37:59 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:06:43.498 10:37:59 -- common/autotest_common.sh@643 -- # accel_perf -t 1 -w xor -y -x -1 00:06:43.498 10:37:59 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
xor -y -x -1 00:06:43.498 10:37:59 -- accel/accel.sh@12 -- # build_accel_config 00:06:43.498 10:37:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:43.498 10:37:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:43.498 10:37:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:43.498 10:37:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:43.498 10:37:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:43.498 10:37:59 -- accel/accel.sh@41 -- # local IFS=, 00:06:43.498 10:37:59 -- accel/accel.sh@42 -- # jq -r . 00:06:43.498 -x option must be non-negative. 00:06:43.498 [2024-07-13 10:37:59.805556] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:06:43.498 accel_perf options: 00:06:43.498 [-h help message] 00:06:43.498 [-q queue depth per core] 00:06:43.498 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:43.498 [-T number of threads per core 00:06:43.498 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:43.498 [-t time in seconds] 00:06:43.498 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:43.498 [ dif_verify, , dif_generate, dif_generate_copy 00:06:43.498 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:43.498 [-l for compress/decompress workloads, name of uncompressed input file 00:06:43.498 [-S for crc32c workload, use this seed value (default 0) 00:06:43.498 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:43.498 [-f for fill workload, use this BYTE value (default 255) 00:06:43.498 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:43.498 [-y verify result if this switch is on] 00:06:43.498 [-a tasks to allocate per core (default: same value as -q)] 00:06:43.498 Can be used to spread operations across a wider range of memory. 
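Note: accel_wrong_workload and accel_negative_buffers exercise the spdk_app_parse_args rejection paths: an unknown -w value and a negative -x count each make accel_perf print the usage block above and exit 1, and the 'Error: writing output failed: Broken pipe' lines are presumably the usage dump racing the already-closed capture pipe. Reusing the not() sketch from earlier:

    not ./build/examples/accel_perf -t 1 -w foobar        # 'Unsupported workload type: foobar'
    not ./build/examples/accel_perf -t 1 -w xor -y -x -1  # '-x option must be non-negative.'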
00:06:43.498 10:37:59 -- common/autotest_common.sh@643 -- # es=1 00:06:43.498 10:37:59 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:06:43.498 10:37:59 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:06:43.498 10:37:59 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:06:43.498 00:06:43.498 real 0m0.030s 00:06:43.498 user 0m0.013s 00:06:43.498 sys 0m0.017s 00:06:43.498 10:37:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:43.498 10:37:59 -- common/autotest_common.sh@10 -- # set +x 00:06:43.498 ************************************ 00:06:43.498 END TEST accel_negative_buffers 00:06:43.498 ************************************ 00:06:43.498 Error: writing output failed: Broken pipe 00:06:43.498 10:37:59 -- accel/accel.sh@93 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:06:43.498 10:37:59 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:43.498 10:37:59 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:43.498 10:37:59 -- common/autotest_common.sh@10 -- # set +x 00:06:43.498 ************************************ 00:06:43.498 START TEST accel_crc32c 00:06:43.498 ************************************ 00:06:43.498 10:37:59 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w crc32c -S 32 -y 00:06:43.498 10:37:59 -- accel/accel.sh@16 -- # local accel_opc 00:06:43.498 10:37:59 -- accel/accel.sh@17 -- # local accel_module 00:06:43.498 10:37:59 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:43.498 10:37:59 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:43.498 10:37:59 -- accel/accel.sh@12 -- # build_accel_config 00:06:43.498 10:37:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:43.498 10:37:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:43.498 10:37:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:43.498 10:37:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:43.498 10:37:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:43.498 10:37:59 -- accel/accel.sh@41 -- # local IFS=, 00:06:43.498 10:37:59 -- accel/accel.sh@42 -- # jq -r . 00:06:43.498 [2024-07-13 10:37:59.881740] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:43.498 [2024-07-13 10:37:59.881823] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1978195 ] 00:06:43.757 EAL: No free 2048 kB hugepages reported on node 1 00:06:43.757 [2024-07-13 10:37:59.953690] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:43.757 [2024-07-13 10:37:59.990142] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.135 10:38:01 -- accel/accel.sh@18 -- # out=' 00:06:45.135 SPDK Configuration: 00:06:45.135 Core mask: 0x1 00:06:45.135 00:06:45.135 Accel Perf Configuration: 00:06:45.135 Workload Type: crc32c 00:06:45.135 CRC-32C seed: 32 00:06:45.135 Transfer size: 4096 bytes 00:06:45.135 Vector count 1 00:06:45.135 Module: software 00:06:45.135 Queue depth: 32 00:06:45.135 Allocate depth: 32 00:06:45.135 # threads/core: 1 00:06:45.135 Run time: 1 seconds 00:06:45.135 Verify: Yes 00:06:45.136 00:06:45.136 Running for 1 seconds... 
00:06:45.136 00:06:45.136 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:45.136 ------------------------------------------------------------------------------------ 00:06:45.136 0,0 829920/s 3241 MiB/s 0 0 00:06:45.136 ==================================================================================== 00:06:45.136 Total 829920/s 3241 MiB/s 0 0' 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # IFS=: 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # read -r var val 00:06:45.136 10:38:01 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:45.136 10:38:01 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:45.136 10:38:01 -- accel/accel.sh@12 -- # build_accel_config 00:06:45.136 10:38:01 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:45.136 10:38:01 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:45.136 10:38:01 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:45.136 10:38:01 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:45.136 10:38:01 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:45.136 10:38:01 -- accel/accel.sh@41 -- # local IFS=, 00:06:45.136 10:38:01 -- accel/accel.sh@42 -- # jq -r . 00:06:45.136 [2024-07-13 10:38:01.174916] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:45.136 [2024-07-13 10:38:01.175042] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1978408 ] 00:06:45.136 EAL: No free 2048 kB hugepages reported on node 1 00:06:45.136 [2024-07-13 10:38:01.243484] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:45.136 [2024-07-13 10:38:01.278037] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.136 10:38:01 -- accel/accel.sh@21 -- # val= 00:06:45.136 10:38:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # IFS=: 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # read -r var val 00:06:45.136 10:38:01 -- accel/accel.sh@21 -- # val= 00:06:45.136 10:38:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # IFS=: 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # read -r var val 00:06:45.136 10:38:01 -- accel/accel.sh@21 -- # val=0x1 00:06:45.136 10:38:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # IFS=: 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # read -r var val 00:06:45.136 10:38:01 -- accel/accel.sh@21 -- # val= 00:06:45.136 10:38:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # IFS=: 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # read -r var val 00:06:45.136 10:38:01 -- accel/accel.sh@21 -- # val= 00:06:45.136 10:38:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # IFS=: 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # read -r var val 00:06:45.136 10:38:01 -- accel/accel.sh@21 -- # val=crc32c 00:06:45.136 10:38:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.136 10:38:01 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # IFS=: 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # read -r var val 00:06:45.136 10:38:01 -- accel/accel.sh@21 -- # val=32 00:06:45.136 10:38:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # IFS=: 00:06:45.136 
10:38:01 -- accel/accel.sh@20 -- # read -r var val 00:06:45.136 10:38:01 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:45.136 10:38:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # IFS=: 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # read -r var val 00:06:45.136 10:38:01 -- accel/accel.sh@21 -- # val= 00:06:45.136 10:38:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # IFS=: 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # read -r var val 00:06:45.136 10:38:01 -- accel/accel.sh@21 -- # val=software 00:06:45.136 10:38:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.136 10:38:01 -- accel/accel.sh@23 -- # accel_module=software 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # IFS=: 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # read -r var val 00:06:45.136 10:38:01 -- accel/accel.sh@21 -- # val=32 00:06:45.136 10:38:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # IFS=: 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # read -r var val 00:06:45.136 10:38:01 -- accel/accel.sh@21 -- # val=32 00:06:45.136 10:38:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # IFS=: 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # read -r var val 00:06:45.136 10:38:01 -- accel/accel.sh@21 -- # val=1 00:06:45.136 10:38:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # IFS=: 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # read -r var val 00:06:45.136 10:38:01 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:45.136 10:38:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # IFS=: 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # read -r var val 00:06:45.136 10:38:01 -- accel/accel.sh@21 -- # val=Yes 00:06:45.136 10:38:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # IFS=: 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # read -r var val 00:06:45.136 10:38:01 -- accel/accel.sh@21 -- # val= 00:06:45.136 10:38:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # IFS=: 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # read -r var val 00:06:45.136 10:38:01 -- accel/accel.sh@21 -- # val= 00:06:45.136 10:38:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # IFS=: 00:06:45.136 10:38:01 -- accel/accel.sh@20 -- # read -r var val 00:06:46.078 10:38:02 -- accel/accel.sh@21 -- # val= 00:06:46.078 10:38:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.078 10:38:02 -- accel/accel.sh@20 -- # IFS=: 00:06:46.078 10:38:02 -- accel/accel.sh@20 -- # read -r var val 00:06:46.078 10:38:02 -- accel/accel.sh@21 -- # val= 00:06:46.078 10:38:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.078 10:38:02 -- accel/accel.sh@20 -- # IFS=: 00:06:46.078 10:38:02 -- accel/accel.sh@20 -- # read -r var val 00:06:46.078 10:38:02 -- accel/accel.sh@21 -- # val= 00:06:46.078 10:38:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.078 10:38:02 -- accel/accel.sh@20 -- # IFS=: 00:06:46.078 10:38:02 -- accel/accel.sh@20 -- # read -r var val 00:06:46.078 10:38:02 -- accel/accel.sh@21 -- # val= 00:06:46.078 10:38:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.078 10:38:02 -- accel/accel.sh@20 -- # IFS=: 00:06:46.078 10:38:02 -- accel/accel.sh@20 -- # read -r var val 00:06:46.078 10:38:02 -- accel/accel.sh@21 -- # val= 00:06:46.078 10:38:02 -- accel/accel.sh@22 -- # case "$var" in 
00:06:46.078 10:38:02 -- accel/accel.sh@20 -- # IFS=: 00:06:46.078 10:38:02 -- accel/accel.sh@20 -- # read -r var val 00:06:46.078 10:38:02 -- accel/accel.sh@21 -- # val= 00:06:46.078 10:38:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.078 10:38:02 -- accel/accel.sh@20 -- # IFS=: 00:06:46.078 10:38:02 -- accel/accel.sh@20 -- # read -r var val 00:06:46.078 10:38:02 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:46.078 10:38:02 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:46.078 10:38:02 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:46.078 00:06:46.078 real 0m2.583s 00:06:46.078 user 0m2.321s 00:06:46.078 sys 0m0.274s 00:06:46.078 10:38:02 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:46.078 10:38:02 -- common/autotest_common.sh@10 -- # set +x 00:06:46.078 ************************************ 00:06:46.078 END TEST accel_crc32c 00:06:46.078 ************************************ 00:06:46.336 10:38:02 -- accel/accel.sh@94 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:06:46.336 10:38:02 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:46.336 10:38:02 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:46.336 10:38:02 -- common/autotest_common.sh@10 -- # set +x 00:06:46.336 ************************************ 00:06:46.336 START TEST accel_crc32c_C2 00:06:46.336 ************************************ 00:06:46.336 10:38:02 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w crc32c -y -C 2 00:06:46.336 10:38:02 -- accel/accel.sh@16 -- # local accel_opc 00:06:46.336 10:38:02 -- accel/accel.sh@17 -- # local accel_module 00:06:46.336 10:38:02 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:46.336 10:38:02 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:46.336 10:38:02 -- accel/accel.sh@12 -- # build_accel_config 00:06:46.336 10:38:02 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:46.336 10:38:02 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:46.336 10:38:02 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:46.336 10:38:02 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:46.336 10:38:02 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:46.336 10:38:02 -- accel/accel.sh@41 -- # local IFS=, 00:06:46.336 10:38:02 -- accel/accel.sh@42 -- # jq -r . 00:06:46.336 [2024-07-13 10:38:02.513530] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
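Note: for the accel_crc32c test above, -S 32 seeds the CRC-32C with 32 and -y verifies every result; the bandwidth column is just transfers/s times the 4096-byte transfer size. Checking the first run by hand:

    echo $(( 829920 * 4096 / 1024 / 1024 ))   # 3241 MiB/s, as reported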
00:06:46.336 [2024-07-13 10:38:02.513623] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1978691 ] 00:06:46.336 EAL: No free 2048 kB hugepages reported on node 1 00:06:46.336 [2024-07-13 10:38:02.584262] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:46.336 [2024-07-13 10:38:02.618803] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.712 10:38:03 -- accel/accel.sh@18 -- # out=' 00:06:47.712 SPDK Configuration: 00:06:47.712 Core mask: 0x1 00:06:47.712 00:06:47.712 Accel Perf Configuration: 00:06:47.712 Workload Type: crc32c 00:06:47.712 CRC-32C seed: 0 00:06:47.712 Transfer size: 4096 bytes 00:06:47.712 Vector count 2 00:06:47.712 Module: software 00:06:47.712 Queue depth: 32 00:06:47.712 Allocate depth: 32 00:06:47.712 # threads/core: 1 00:06:47.712 Run time: 1 seconds 00:06:47.712 Verify: Yes 00:06:47.712 00:06:47.712 Running for 1 seconds... 00:06:47.712 00:06:47.712 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:47.712 ------------------------------------------------------------------------------------ 00:06:47.712 0,0 617504/s 4824 MiB/s 0 0 00:06:47.712 ==================================================================================== 00:06:47.712 Total 617504/s 2412 MiB/s 0 0' 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # IFS=: 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # read -r var val 00:06:47.712 10:38:03 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:47.712 10:38:03 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:47.712 10:38:03 -- accel/accel.sh@12 -- # build_accel_config 00:06:47.712 10:38:03 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:47.712 10:38:03 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:47.712 10:38:03 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:47.712 10:38:03 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:47.712 10:38:03 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:47.712 10:38:03 -- accel/accel.sh@41 -- # local IFS=, 00:06:47.712 10:38:03 -- accel/accel.sh@42 -- # jq -r . 00:06:47.712 [2024-07-13 10:38:03.798512] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
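Note: with -C 2 each operation appears to chain the CRC over two 4096-byte vectors, which would explain why the per-core row and the Total row above differ by exactly 2x: both describe the same 617504 ops/s, counting two vectors or one per operation:

    echo $(( 617504 * 2 * 4096 / 1024 / 1024 ))   # 4824 MiB/s, the per-core row
    echo $(( 617504 * 4096 / 1024 / 1024 ))       # 2412 MiB/s, the Total row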
00:06:47.712 [2024-07-13 10:38:03.798604] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1978957 ] 00:06:47.712 EAL: No free 2048 kB hugepages reported on node 1 00:06:47.712 [2024-07-13 10:38:03.868405] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:47.712 [2024-07-13 10:38:03.902695] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.712 10:38:03 -- accel/accel.sh@21 -- # val= 00:06:47.712 10:38:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # IFS=: 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # read -r var val 00:06:47.712 10:38:03 -- accel/accel.sh@21 -- # val= 00:06:47.712 10:38:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # IFS=: 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # read -r var val 00:06:47.712 10:38:03 -- accel/accel.sh@21 -- # val=0x1 00:06:47.712 10:38:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # IFS=: 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # read -r var val 00:06:47.712 10:38:03 -- accel/accel.sh@21 -- # val= 00:06:47.712 10:38:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # IFS=: 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # read -r var val 00:06:47.712 10:38:03 -- accel/accel.sh@21 -- # val= 00:06:47.712 10:38:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # IFS=: 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # read -r var val 00:06:47.712 10:38:03 -- accel/accel.sh@21 -- # val=crc32c 00:06:47.712 10:38:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.712 10:38:03 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # IFS=: 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # read -r var val 00:06:47.712 10:38:03 -- accel/accel.sh@21 -- # val=0 00:06:47.712 10:38:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # IFS=: 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # read -r var val 00:06:47.712 10:38:03 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:47.712 10:38:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # IFS=: 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # read -r var val 00:06:47.712 10:38:03 -- accel/accel.sh@21 -- # val= 00:06:47.712 10:38:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # IFS=: 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # read -r var val 00:06:47.712 10:38:03 -- accel/accel.sh@21 -- # val=software 00:06:47.712 10:38:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.712 10:38:03 -- accel/accel.sh@23 -- # accel_module=software 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # IFS=: 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # read -r var val 00:06:47.712 10:38:03 -- accel/accel.sh@21 -- # val=32 00:06:47.712 10:38:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # IFS=: 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # read -r var val 00:06:47.712 10:38:03 -- accel/accel.sh@21 -- # val=32 00:06:47.712 10:38:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # IFS=: 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # read -r var val 00:06:47.712 10:38:03 -- 
accel/accel.sh@21 -- # val=1 00:06:47.712 10:38:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # IFS=: 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # read -r var val 00:06:47.712 10:38:03 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:47.712 10:38:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # IFS=: 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # read -r var val 00:06:47.712 10:38:03 -- accel/accel.sh@21 -- # val=Yes 00:06:47.712 10:38:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # IFS=: 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # read -r var val 00:06:47.712 10:38:03 -- accel/accel.sh@21 -- # val= 00:06:47.712 10:38:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # IFS=: 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # read -r var val 00:06:47.712 10:38:03 -- accel/accel.sh@21 -- # val= 00:06:47.712 10:38:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # IFS=: 00:06:47.712 10:38:03 -- accel/accel.sh@20 -- # read -r var val 00:06:49.087 10:38:05 -- accel/accel.sh@21 -- # val= 00:06:49.087 10:38:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.087 10:38:05 -- accel/accel.sh@20 -- # IFS=: 00:06:49.087 10:38:05 -- accel/accel.sh@20 -- # read -r var val 00:06:49.087 10:38:05 -- accel/accel.sh@21 -- # val= 00:06:49.087 10:38:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.087 10:38:05 -- accel/accel.sh@20 -- # IFS=: 00:06:49.087 10:38:05 -- accel/accel.sh@20 -- # read -r var val 00:06:49.087 10:38:05 -- accel/accel.sh@21 -- # val= 00:06:49.087 10:38:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.087 10:38:05 -- accel/accel.sh@20 -- # IFS=: 00:06:49.087 10:38:05 -- accel/accel.sh@20 -- # read -r var val 00:06:49.087 10:38:05 -- accel/accel.sh@21 -- # val= 00:06:49.087 10:38:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.087 10:38:05 -- accel/accel.sh@20 -- # IFS=: 00:06:49.087 10:38:05 -- accel/accel.sh@20 -- # read -r var val 00:06:49.087 10:38:05 -- accel/accel.sh@21 -- # val= 00:06:49.087 10:38:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.087 10:38:05 -- accel/accel.sh@20 -- # IFS=: 00:06:49.087 10:38:05 -- accel/accel.sh@20 -- # read -r var val 00:06:49.087 10:38:05 -- accel/accel.sh@21 -- # val= 00:06:49.087 10:38:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.087 10:38:05 -- accel/accel.sh@20 -- # IFS=: 00:06:49.087 10:38:05 -- accel/accel.sh@20 -- # read -r var val 00:06:49.087 10:38:05 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:49.087 10:38:05 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:49.087 10:38:05 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:49.087 00:06:49.087 real 0m2.575s 00:06:49.087 user 0m2.333s 00:06:49.087 sys 0m0.251s 00:06:49.087 10:38:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:49.087 10:38:05 -- common/autotest_common.sh@10 -- # set +x 00:06:49.087 ************************************ 00:06:49.087 END TEST accel_crc32c_C2 00:06:49.087 ************************************ 00:06:49.087 10:38:05 -- accel/accel.sh@95 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:06:49.087 10:38:05 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:49.087 10:38:05 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:49.087 10:38:05 -- common/autotest_common.sh@10 -- # set +x 00:06:49.087 ************************************ 00:06:49.087 START TEST accel_copy 
00:06:49.087 ************************************ 00:06:49.087 10:38:05 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy -y 00:06:49.087 10:38:05 -- accel/accel.sh@16 -- # local accel_opc 00:06:49.087 10:38:05 -- accel/accel.sh@17 -- # local accel_module 00:06:49.087 10:38:05 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy -y 00:06:49.087 10:38:05 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:49.087 10:38:05 -- accel/accel.sh@12 -- # build_accel_config 00:06:49.087 10:38:05 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:49.087 10:38:05 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:49.087 10:38:05 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:49.087 10:38:05 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:49.087 10:38:05 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:49.087 10:38:05 -- accel/accel.sh@41 -- # local IFS=, 00:06:49.087 10:38:05 -- accel/accel.sh@42 -- # jq -r . 00:06:49.087 [2024-07-13 10:38:05.136494] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:49.087 [2024-07-13 10:38:05.136591] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1979244 ] 00:06:49.087 EAL: No free 2048 kB hugepages reported on node 1 00:06:49.087 [2024-07-13 10:38:05.205419] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.087 [2024-07-13 10:38:05.241740] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:50.022 10:38:06 -- accel/accel.sh@18 -- # out=' 00:06:50.022 SPDK Configuration: 00:06:50.022 Core mask: 0x1 00:06:50.022 00:06:50.022 Accel Perf Configuration: 00:06:50.022 Workload Type: copy 00:06:50.022 Transfer size: 4096 bytes 00:06:50.022 Vector count 1 00:06:50.023 Module: software 00:06:50.023 Queue depth: 32 00:06:50.023 Allocate depth: 32 00:06:50.023 # threads/core: 1 00:06:50.023 Run time: 1 seconds 00:06:50.023 Verify: Yes 00:06:50.023 00:06:50.023 Running for 1 seconds... 00:06:50.023 00:06:50.023 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:50.023 ------------------------------------------------------------------------------------ 00:06:50.023 0,0 548480/s 2142 MiB/s 0 0 00:06:50.023 ==================================================================================== 00:06:50.023 Total 548480/s 2142 MiB/s 0 0' 00:06:50.023 10:38:06 -- accel/accel.sh@20 -- # IFS=: 00:06:50.023 10:38:06 -- accel/accel.sh@20 -- # read -r var val 00:06:50.023 10:38:06 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:06:50.023 10:38:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:50.023 10:38:06 -- accel/accel.sh@12 -- # build_accel_config 00:06:50.023 10:38:06 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:50.023 10:38:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:50.023 10:38:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:50.023 10:38:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:50.023 10:38:06 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:50.023 10:38:06 -- accel/accel.sh@41 -- # local IFS=, 00:06:50.023 10:38:06 -- accel/accel.sh@42 -- # jq -r . 00:06:50.282 [2024-07-13 10:38:06.422592] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:06:50.282 [2024-07-13 10:38:06.422701] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1979493 ] 00:06:50.282 EAL: No free 2048 kB hugepages reported on node 1 00:06:50.282 [2024-07-13 10:38:06.490719] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:50.282 [2024-07-13 10:38:06.524709] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:50.282 10:38:06 -- accel/accel.sh@21 -- # val= 00:06:50.282 10:38:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.282 10:38:06 -- accel/accel.sh@20 -- # IFS=: 00:06:50.282 10:38:06 -- accel/accel.sh@20 -- # read -r var val 00:06:50.282 10:38:06 -- accel/accel.sh@21 -- # val= 00:06:50.282 10:38:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.282 10:38:06 -- accel/accel.sh@20 -- # IFS=: 00:06:50.282 10:38:06 -- accel/accel.sh@20 -- # read -r var val 00:06:50.282 10:38:06 -- accel/accel.sh@21 -- # val=0x1 00:06:50.282 10:38:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.282 10:38:06 -- accel/accel.sh@20 -- # IFS=: 00:06:50.282 10:38:06 -- accel/accel.sh@20 -- # read -r var val 00:06:50.282 10:38:06 -- accel/accel.sh@21 -- # val= 00:06:50.282 10:38:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.282 10:38:06 -- accel/accel.sh@20 -- # IFS=: 00:06:50.282 10:38:06 -- accel/accel.sh@20 -- # read -r var val 00:06:50.282 10:38:06 -- accel/accel.sh@21 -- # val= 00:06:50.282 10:38:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.282 10:38:06 -- accel/accel.sh@20 -- # IFS=: 00:06:50.282 10:38:06 -- accel/accel.sh@20 -- # read -r var val 00:06:50.282 10:38:06 -- accel/accel.sh@21 -- # val=copy 00:06:50.282 10:38:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.282 10:38:06 -- accel/accel.sh@24 -- # accel_opc=copy 00:06:50.282 10:38:06 -- accel/accel.sh@20 -- # IFS=: 00:06:50.282 10:38:06 -- accel/accel.sh@20 -- # read -r var val 00:06:50.282 10:38:06 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:50.282 10:38:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.282 10:38:06 -- accel/accel.sh@20 -- # IFS=: 00:06:50.282 10:38:06 -- accel/accel.sh@20 -- # read -r var val 00:06:50.282 10:38:06 -- accel/accel.sh@21 -- # val= 00:06:50.282 10:38:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.282 10:38:06 -- accel/accel.sh@20 -- # IFS=: 00:06:50.282 10:38:06 -- accel/accel.sh@20 -- # read -r var val 00:06:50.282 10:38:06 -- accel/accel.sh@21 -- # val=software 00:06:50.282 10:38:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.282 10:38:06 -- accel/accel.sh@23 -- # accel_module=software 00:06:50.282 10:38:06 -- accel/accel.sh@20 -- # IFS=: 00:06:50.282 10:38:06 -- accel/accel.sh@20 -- # read -r var val 00:06:50.282 10:38:06 -- accel/accel.sh@21 -- # val=32 00:06:50.283 10:38:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.283 10:38:06 -- accel/accel.sh@20 -- # IFS=: 00:06:50.283 10:38:06 -- accel/accel.sh@20 -- # read -r var val 00:06:50.283 10:38:06 -- accel/accel.sh@21 -- # val=32 00:06:50.283 10:38:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.283 10:38:06 -- accel/accel.sh@20 -- # IFS=: 00:06:50.283 10:38:06 -- accel/accel.sh@20 -- # read -r var val 00:06:50.283 10:38:06 -- accel/accel.sh@21 -- # val=1 00:06:50.283 10:38:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.283 10:38:06 -- accel/accel.sh@20 -- # IFS=: 00:06:50.283 10:38:06 -- accel/accel.sh@20 -- # read -r var val 00:06:50.283 10:38:06 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:50.283 10:38:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.283 10:38:06 -- accel/accel.sh@20 -- # IFS=: 00:06:50.283 10:38:06 -- accel/accel.sh@20 -- # read -r var val 00:06:50.283 10:38:06 -- accel/accel.sh@21 -- # val=Yes 00:06:50.283 10:38:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.283 10:38:06 -- accel/accel.sh@20 -- # IFS=: 00:06:50.283 10:38:06 -- accel/accel.sh@20 -- # read -r var val 00:06:50.283 10:38:06 -- accel/accel.sh@21 -- # val= 00:06:50.283 10:38:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.283 10:38:06 -- accel/accel.sh@20 -- # IFS=: 00:06:50.283 10:38:06 -- accel/accel.sh@20 -- # read -r var val 00:06:50.283 10:38:06 -- accel/accel.sh@21 -- # val= 00:06:50.283 10:38:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.283 10:38:06 -- accel/accel.sh@20 -- # IFS=: 00:06:50.283 10:38:06 -- accel/accel.sh@20 -- # read -r var val 00:06:51.695 10:38:07 -- accel/accel.sh@21 -- # val= 00:06:51.695 10:38:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.695 10:38:07 -- accel/accel.sh@20 -- # IFS=: 00:06:51.695 10:38:07 -- accel/accel.sh@20 -- # read -r var val 00:06:51.695 10:38:07 -- accel/accel.sh@21 -- # val= 00:06:51.695 10:38:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.695 10:38:07 -- accel/accel.sh@20 -- # IFS=: 00:06:51.695 10:38:07 -- accel/accel.sh@20 -- # read -r var val 00:06:51.695 10:38:07 -- accel/accel.sh@21 -- # val= 00:06:51.695 10:38:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.695 10:38:07 -- accel/accel.sh@20 -- # IFS=: 00:06:51.695 10:38:07 -- accel/accel.sh@20 -- # read -r var val 00:06:51.695 10:38:07 -- accel/accel.sh@21 -- # val= 00:06:51.695 10:38:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.695 10:38:07 -- accel/accel.sh@20 -- # IFS=: 00:06:51.695 10:38:07 -- accel/accel.sh@20 -- # read -r var val 00:06:51.695 10:38:07 -- accel/accel.sh@21 -- # val= 00:06:51.695 10:38:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.695 10:38:07 -- accel/accel.sh@20 -- # IFS=: 00:06:51.695 10:38:07 -- accel/accel.sh@20 -- # read -r var val 00:06:51.695 10:38:07 -- accel/accel.sh@21 -- # val= 00:06:51.695 10:38:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.695 10:38:07 -- accel/accel.sh@20 -- # IFS=: 00:06:51.695 10:38:07 -- accel/accel.sh@20 -- # read -r var val 00:06:51.695 10:38:07 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:51.695 10:38:07 -- accel/accel.sh@28 -- # [[ -n copy ]] 00:06:51.695 10:38:07 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:51.695 00:06:51.695 real 0m2.574s 00:06:51.695 user 0m2.327s 00:06:51.695 sys 0m0.254s 00:06:51.695 10:38:07 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:51.695 10:38:07 -- common/autotest_common.sh@10 -- # set +x 00:06:51.695 ************************************ 00:06:51.695 END TEST accel_copy 00:06:51.695 ************************************ 00:06:51.695 10:38:07 -- accel/accel.sh@96 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:51.695 10:38:07 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:06:51.695 10:38:07 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:51.695 10:38:07 -- common/autotest_common.sh@10 -- # set +x 00:06:51.695 ************************************ 00:06:51.695 START TEST accel_fill 00:06:51.695 ************************************ 00:06:51.695 10:38:07 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:51.695 10:38:07 -- accel/accel.sh@16 -- # local accel_opc 
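Each test closes the same way: bash's time summary (real/user/sys), then three [[ ]] guards, which the xtrace shows after expansion, e.g. [[ -n software ]] and [[ software == \s\o\f\t\w\a\r\e ]]. A sketch of that guard pattern before expansion, with $accel_module and $accel_opc as hypothetical names for the variables the harness fills while reading the accel_perf output:

  # assert a module and opcode were parsed and the software path was exercised
  [[ -n "$accel_module" ]] && [[ -n "$accel_opc" ]] && [[ "$accel_module" == software ]]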
00:06:51.695 10:38:07 -- accel/accel.sh@17 -- # local accel_module 00:06:51.695 10:38:07 -- accel/accel.sh@18 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:51.695 10:38:07 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:51.696 10:38:07 -- accel/accel.sh@12 -- # build_accel_config 00:06:51.696 10:38:07 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:51.696 10:38:07 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:51.696 10:38:07 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:51.696 10:38:07 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:51.696 10:38:07 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:51.696 10:38:07 -- accel/accel.sh@41 -- # local IFS=, 00:06:51.696 10:38:07 -- accel/accel.sh@42 -- # jq -r . 00:06:51.696 [2024-07-13 10:38:07.758223] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:51.696 [2024-07-13 10:38:07.758306] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1979687 ] 00:06:51.696 EAL: No free 2048 kB hugepages reported on node 1 00:06:51.696 [2024-07-13 10:38:07.827889] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:51.696 [2024-07-13 10:38:07.864411] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.075 10:38:09 -- accel/accel.sh@18 -- # out=' 00:06:53.075 SPDK Configuration: 00:06:53.075 Core mask: 0x1 00:06:53.075 00:06:53.075 Accel Perf Configuration: 00:06:53.075 Workload Type: fill 00:06:53.075 Fill pattern: 0x80 00:06:53.075 Transfer size: 4096 bytes 00:06:53.075 Vector count 1 00:06:53.075 Module: software 00:06:53.075 Queue depth: 64 00:06:53.075 Allocate depth: 64 00:06:53.075 # threads/core: 1 00:06:53.075 Run time: 1 seconds 00:06:53.075 Verify: Yes 00:06:53.075 00:06:53.075 Running for 1 seconds... 00:06:53.075 00:06:53.075 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:53.075 ------------------------------------------------------------------------------------ 00:06:53.075 0,0 946752/s 3698 MiB/s 0 0 00:06:53.075 ==================================================================================== 00:06:53.075 Total 946752/s 3698 MiB/s 0 0' 00:06:53.075 10:38:09 -- accel/accel.sh@20 -- # IFS=: 00:06:53.075 10:38:09 -- accel/accel.sh@20 -- # read -r var val 00:06:53.075 10:38:09 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:53.075 10:38:09 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:53.075 10:38:09 -- accel/accel.sh@12 -- # build_accel_config 00:06:53.075 10:38:09 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:53.075 10:38:09 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:53.075 10:38:09 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:53.075 10:38:09 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:53.075 10:38:09 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:53.075 10:38:09 -- accel/accel.sh@41 -- # local IFS=, 00:06:53.075 10:38:09 -- accel/accel.sh@42 -- # jq -r . 00:06:53.075 [2024-07-13 10:38:09.045232] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
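For fill, -f 128 selects the repeated byte (0x80 in the configuration dump), and -q 64 with -a 64 doubles the queue and allocate depths used by the other workloads. A sketch of the equivalent manual invocation, reusing the $SPDK_DIR assumption above:

  # fill 4096-byte buffers with 0x80 at queue depth 64
  "$SPDK_DIR/build/examples/accel_perf" -t 1 -w fill -f 128 -q 64 -a 64 -y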
00:06:53.075 [2024-07-13 10:38:09.045322] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1979848 ] 00:06:53.075 EAL: No free 2048 kB hugepages reported on node 1 00:06:53.075 [2024-07-13 10:38:09.116577] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.075 [2024-07-13 10:38:09.152059] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.075 10:38:09 -- accel/accel.sh@21 -- # val= 00:06:53.075 10:38:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.075 10:38:09 -- accel/accel.sh@20 -- # IFS=: 00:06:53.075 10:38:09 -- accel/accel.sh@20 -- # read -r var val 00:06:53.075 10:38:09 -- accel/accel.sh@21 -- # val= 00:06:53.075 10:38:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.075 10:38:09 -- accel/accel.sh@20 -- # IFS=: 00:06:53.075 10:38:09 -- accel/accel.sh@20 -- # read -r var val 00:06:53.075 10:38:09 -- accel/accel.sh@21 -- # val=0x1 00:06:53.075 10:38:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.075 10:38:09 -- accel/accel.sh@20 -- # IFS=: 00:06:53.075 10:38:09 -- accel/accel.sh@20 -- # read -r var val 00:06:53.075 10:38:09 -- accel/accel.sh@21 -- # val= 00:06:53.075 10:38:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.075 10:38:09 -- accel/accel.sh@20 -- # IFS=: 00:06:53.075 10:38:09 -- accel/accel.sh@20 -- # read -r var val 00:06:53.075 10:38:09 -- accel/accel.sh@21 -- # val= 00:06:53.075 10:38:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.075 10:38:09 -- accel/accel.sh@20 -- # IFS=: 00:06:53.076 10:38:09 -- accel/accel.sh@20 -- # read -r var val 00:06:53.076 10:38:09 -- accel/accel.sh@21 -- # val=fill 00:06:53.076 10:38:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.076 10:38:09 -- accel/accel.sh@24 -- # accel_opc=fill 00:06:53.076 10:38:09 -- accel/accel.sh@20 -- # IFS=: 00:06:53.076 10:38:09 -- accel/accel.sh@20 -- # read -r var val 00:06:53.076 10:38:09 -- accel/accel.sh@21 -- # val=0x80 00:06:53.076 10:38:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.076 10:38:09 -- accel/accel.sh@20 -- # IFS=: 00:06:53.076 10:38:09 -- accel/accel.sh@20 -- # read -r var val 00:06:53.076 10:38:09 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:53.076 10:38:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.076 10:38:09 -- accel/accel.sh@20 -- # IFS=: 00:06:53.076 10:38:09 -- accel/accel.sh@20 -- # read -r var val 00:06:53.076 10:38:09 -- accel/accel.sh@21 -- # val= 00:06:53.076 10:38:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.076 10:38:09 -- accel/accel.sh@20 -- # IFS=: 00:06:53.076 10:38:09 -- accel/accel.sh@20 -- # read -r var val 00:06:53.076 10:38:09 -- accel/accel.sh@21 -- # val=software 00:06:53.076 10:38:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.076 10:38:09 -- accel/accel.sh@23 -- # accel_module=software 00:06:53.076 10:38:09 -- accel/accel.sh@20 -- # IFS=: 00:06:53.076 10:38:09 -- accel/accel.sh@20 -- # read -r var val 00:06:53.076 10:38:09 -- accel/accel.sh@21 -- # val=64 00:06:53.076 10:38:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.076 10:38:09 -- accel/accel.sh@20 -- # IFS=: 00:06:53.076 10:38:09 -- accel/accel.sh@20 -- # read -r var val 00:06:53.076 10:38:09 -- accel/accel.sh@21 -- # val=64 00:06:53.076 10:38:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.076 10:38:09 -- accel/accel.sh@20 -- # IFS=: 00:06:53.076 10:38:09 -- accel/accel.sh@20 -- # read -r var val 00:06:53.076 10:38:09 -- 
accel/accel.sh@21 -- # val=1 00:06:53.076 10:38:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.076 10:38:09 -- accel/accel.sh@20 -- # IFS=: 00:06:53.076 10:38:09 -- accel/accel.sh@20 -- # read -r var val 00:06:53.076 10:38:09 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:53.076 10:38:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.076 10:38:09 -- accel/accel.sh@20 -- # IFS=: 00:06:53.076 10:38:09 -- accel/accel.sh@20 -- # read -r var val 00:06:53.076 10:38:09 -- accel/accel.sh@21 -- # val=Yes 00:06:53.076 10:38:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.076 10:38:09 -- accel/accel.sh@20 -- # IFS=: 00:06:53.076 10:38:09 -- accel/accel.sh@20 -- # read -r var val 00:06:53.076 10:38:09 -- accel/accel.sh@21 -- # val= 00:06:53.076 10:38:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.076 10:38:09 -- accel/accel.sh@20 -- # IFS=: 00:06:53.076 10:38:09 -- accel/accel.sh@20 -- # read -r var val 00:06:53.076 10:38:09 -- accel/accel.sh@21 -- # val= 00:06:53.076 10:38:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.076 10:38:09 -- accel/accel.sh@20 -- # IFS=: 00:06:53.076 10:38:09 -- accel/accel.sh@20 -- # read -r var val 00:06:54.013 10:38:10 -- accel/accel.sh@21 -- # val= 00:06:54.013 10:38:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.013 10:38:10 -- accel/accel.sh@20 -- # IFS=: 00:06:54.013 10:38:10 -- accel/accel.sh@20 -- # read -r var val 00:06:54.013 10:38:10 -- accel/accel.sh@21 -- # val= 00:06:54.013 10:38:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.013 10:38:10 -- accel/accel.sh@20 -- # IFS=: 00:06:54.013 10:38:10 -- accel/accel.sh@20 -- # read -r var val 00:06:54.013 10:38:10 -- accel/accel.sh@21 -- # val= 00:06:54.013 10:38:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.013 10:38:10 -- accel/accel.sh@20 -- # IFS=: 00:06:54.013 10:38:10 -- accel/accel.sh@20 -- # read -r var val 00:06:54.013 10:38:10 -- accel/accel.sh@21 -- # val= 00:06:54.013 10:38:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.013 10:38:10 -- accel/accel.sh@20 -- # IFS=: 00:06:54.013 10:38:10 -- accel/accel.sh@20 -- # read -r var val 00:06:54.013 10:38:10 -- accel/accel.sh@21 -- # val= 00:06:54.013 10:38:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.013 10:38:10 -- accel/accel.sh@20 -- # IFS=: 00:06:54.013 10:38:10 -- accel/accel.sh@20 -- # read -r var val 00:06:54.014 10:38:10 -- accel/accel.sh@21 -- # val= 00:06:54.014 10:38:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.014 10:38:10 -- accel/accel.sh@20 -- # IFS=: 00:06:54.014 10:38:10 -- accel/accel.sh@20 -- # read -r var val 00:06:54.014 10:38:10 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:54.014 10:38:10 -- accel/accel.sh@28 -- # [[ -n fill ]] 00:06:54.014 10:38:10 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:54.014 00:06:54.014 real 0m2.580s 00:06:54.014 user 0m2.330s 00:06:54.014 sys 0m0.259s 00:06:54.014 10:38:10 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:54.014 10:38:10 -- common/autotest_common.sh@10 -- # set +x 00:06:54.014 ************************************ 00:06:54.014 END TEST accel_fill 00:06:54.014 ************************************ 00:06:54.014 10:38:10 -- accel/accel.sh@97 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:06:54.014 10:38:10 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:54.014 10:38:10 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:54.014 10:38:10 -- common/autotest_common.sh@10 -- # set +x 00:06:54.014 ************************************ 00:06:54.014 START TEST 
accel_copy_crc32c 00:06:54.014 ************************************ 00:06:54.014 10:38:10 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy_crc32c -y 00:06:54.014 10:38:10 -- accel/accel.sh@16 -- # local accel_opc 00:06:54.014 10:38:10 -- accel/accel.sh@17 -- # local accel_module 00:06:54.014 10:38:10 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:54.014 10:38:10 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:54.014 10:38:10 -- accel/accel.sh@12 -- # build_accel_config 00:06:54.014 10:38:10 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:54.014 10:38:10 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:54.014 10:38:10 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:54.014 10:38:10 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:54.014 10:38:10 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:54.014 10:38:10 -- accel/accel.sh@41 -- # local IFS=, 00:06:54.014 10:38:10 -- accel/accel.sh@42 -- # jq -r . 00:06:54.014 [2024-07-13 10:38:10.387340] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:54.014 [2024-07-13 10:38:10.387425] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1980115 ] 00:06:54.273 EAL: No free 2048 kB hugepages reported on node 1 00:06:54.274 [2024-07-13 10:38:10.454770] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.274 [2024-07-13 10:38:10.490288] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.653 10:38:11 -- accel/accel.sh@18 -- # out=' 00:06:55.653 SPDK Configuration: 00:06:55.653 Core mask: 0x1 00:06:55.653 00:06:55.653 Accel Perf Configuration: 00:06:55.653 Workload Type: copy_crc32c 00:06:55.653 CRC-32C seed: 0 00:06:55.653 Vector size: 4096 bytes 00:06:55.653 Transfer size: 4096 bytes 00:06:55.653 Vector count 1 00:06:55.653 Module: software 00:06:55.653 Queue depth: 32 00:06:55.653 Allocate depth: 32 00:06:55.653 # threads/core: 1 00:06:55.653 Run time: 1 seconds 00:06:55.654 Verify: Yes 00:06:55.654 00:06:55.654 Running for 1 seconds... 00:06:55.654 00:06:55.654 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:55.654 ------------------------------------------------------------------------------------ 00:06:55.654 0,0 414752/s 1620 MiB/s 0 0 00:06:55.654 ==================================================================================== 00:06:55.654 Total 414752/s 1620 MiB/s 0 0' 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # IFS=: 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # read -r var val 00:06:55.654 10:38:11 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:55.654 10:38:11 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:55.654 10:38:11 -- accel/accel.sh@12 -- # build_accel_config 00:06:55.654 10:38:11 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:55.654 10:38:11 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:55.654 10:38:11 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:55.654 10:38:11 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:55.654 10:38:11 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:55.654 10:38:11 -- accel/accel.sh@41 -- # local IFS=, 00:06:55.654 10:38:11 -- accel/accel.sh@42 -- # jq -r . 
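copy_crc32c copies each 4096-byte buffer and computes a CRC-32C over it (seed 0 per the dump) in a single operation, which is why its 414752 ops/s trails the 548480 ops/s that plain copy posted on the same software path. The manual equivalent, under the same $SPDK_DIR assumption:

  "$SPDK_DIR/build/examples/accel_perf" -t 1 -w copy_crc32c -y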
00:06:55.654 [2024-07-13 10:38:11.669206] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:55.654 [2024-07-13 10:38:11.669315] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1980384 ] 00:06:55.654 EAL: No free 2048 kB hugepages reported on node 1 00:06:55.654 [2024-07-13 10:38:11.737467] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:55.654 [2024-07-13 10:38:11.771462] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.654 10:38:11 -- accel/accel.sh@21 -- # val= 00:06:55.654 10:38:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # IFS=: 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # read -r var val 00:06:55.654 10:38:11 -- accel/accel.sh@21 -- # val= 00:06:55.654 10:38:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # IFS=: 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # read -r var val 00:06:55.654 10:38:11 -- accel/accel.sh@21 -- # val=0x1 00:06:55.654 10:38:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # IFS=: 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # read -r var val 00:06:55.654 10:38:11 -- accel/accel.sh@21 -- # val= 00:06:55.654 10:38:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # IFS=: 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # read -r var val 00:06:55.654 10:38:11 -- accel/accel.sh@21 -- # val= 00:06:55.654 10:38:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # IFS=: 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # read -r var val 00:06:55.654 10:38:11 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:55.654 10:38:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.654 10:38:11 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # IFS=: 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # read -r var val 00:06:55.654 10:38:11 -- accel/accel.sh@21 -- # val=0 00:06:55.654 10:38:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # IFS=: 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # read -r var val 00:06:55.654 10:38:11 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:55.654 10:38:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # IFS=: 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # read -r var val 00:06:55.654 10:38:11 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:55.654 10:38:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # IFS=: 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # read -r var val 00:06:55.654 10:38:11 -- accel/accel.sh@21 -- # val= 00:06:55.654 10:38:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # IFS=: 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # read -r var val 00:06:55.654 10:38:11 -- accel/accel.sh@21 -- # val=software 00:06:55.654 10:38:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.654 10:38:11 -- accel/accel.sh@23 -- # accel_module=software 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # IFS=: 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # read -r var val 00:06:55.654 10:38:11 -- accel/accel.sh@21 -- # val=32 00:06:55.654 10:38:11 -- accel/accel.sh@22 -- # case "$var" in 
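The -c /dev/fd/62 argument in every invocation appears to be bash process substitution: build_accel_config fills accel_json_cfg and jq -r . renders it, so accel_perf reads its JSON config from a file descriptor rather than a file on disk. A rough sketch of the same idea; the JSON body is a hypothetical stand-in, since in these runs the config array is empty:

  cfg='{"subsystems":[]}'   # hypothetical empty accel config
  "$SPDK_DIR/build/examples/accel_perf" -c <(printf '%s' "$cfg") -t 1 -w copy_crc32c -y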
00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # IFS=: 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # read -r var val 00:06:55.654 10:38:11 -- accel/accel.sh@21 -- # val=32 00:06:55.654 10:38:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # IFS=: 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # read -r var val 00:06:55.654 10:38:11 -- accel/accel.sh@21 -- # val=1 00:06:55.654 10:38:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # IFS=: 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # read -r var val 00:06:55.654 10:38:11 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:55.654 10:38:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # IFS=: 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # read -r var val 00:06:55.654 10:38:11 -- accel/accel.sh@21 -- # val=Yes 00:06:55.654 10:38:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # IFS=: 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # read -r var val 00:06:55.654 10:38:11 -- accel/accel.sh@21 -- # val= 00:06:55.654 10:38:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # IFS=: 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # read -r var val 00:06:55.654 10:38:11 -- accel/accel.sh@21 -- # val= 00:06:55.654 10:38:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # IFS=: 00:06:55.654 10:38:11 -- accel/accel.sh@20 -- # read -r var val 00:06:56.592 10:38:12 -- accel/accel.sh@21 -- # val= 00:06:56.592 10:38:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.592 10:38:12 -- accel/accel.sh@20 -- # IFS=: 00:06:56.592 10:38:12 -- accel/accel.sh@20 -- # read -r var val 00:06:56.592 10:38:12 -- accel/accel.sh@21 -- # val= 00:06:56.592 10:38:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.592 10:38:12 -- accel/accel.sh@20 -- # IFS=: 00:06:56.592 10:38:12 -- accel/accel.sh@20 -- # read -r var val 00:06:56.592 10:38:12 -- accel/accel.sh@21 -- # val= 00:06:56.592 10:38:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.592 10:38:12 -- accel/accel.sh@20 -- # IFS=: 00:06:56.592 10:38:12 -- accel/accel.sh@20 -- # read -r var val 00:06:56.592 10:38:12 -- accel/accel.sh@21 -- # val= 00:06:56.592 10:38:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.592 10:38:12 -- accel/accel.sh@20 -- # IFS=: 00:06:56.592 10:38:12 -- accel/accel.sh@20 -- # read -r var val 00:06:56.592 10:38:12 -- accel/accel.sh@21 -- # val= 00:06:56.592 10:38:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.592 10:38:12 -- accel/accel.sh@20 -- # IFS=: 00:06:56.592 10:38:12 -- accel/accel.sh@20 -- # read -r var val 00:06:56.592 10:38:12 -- accel/accel.sh@21 -- # val= 00:06:56.592 10:38:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.592 10:38:12 -- accel/accel.sh@20 -- # IFS=: 00:06:56.592 10:38:12 -- accel/accel.sh@20 -- # read -r var val 00:06:56.592 10:38:12 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:56.592 10:38:12 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:56.592 10:38:12 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:56.592 00:06:56.593 real 0m2.569s 00:06:56.593 user 0m2.321s 00:06:56.593 sys 0m0.257s 00:06:56.593 10:38:12 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:56.593 10:38:12 -- common/autotest_common.sh@10 -- # set +x 00:06:56.593 ************************************ 00:06:56.593 END TEST accel_copy_crc32c 00:06:56.593 ************************************ 00:06:56.593 
10:38:12 -- accel/accel.sh@98 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:06:56.593 10:38:12 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:06:56.593 10:38:12 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:56.593 10:38:12 -- common/autotest_common.sh@10 -- # set +x 00:06:56.851 ************************************ 00:06:56.851 START TEST accel_copy_crc32c_C2 00:06:56.851 ************************************ 00:06:56.851 10:38:12 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:06:56.851 10:38:12 -- accel/accel.sh@16 -- # local accel_opc 00:06:56.851 10:38:12 -- accel/accel.sh@17 -- # local accel_module 00:06:56.851 10:38:12 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:56.851 10:38:12 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:56.851 10:38:12 -- accel/accel.sh@12 -- # build_accel_config 00:06:56.851 10:38:12 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:56.851 10:38:12 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:56.851 10:38:12 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:56.851 10:38:12 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:56.851 10:38:12 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:56.851 10:38:12 -- accel/accel.sh@41 -- # local IFS=, 00:06:56.851 10:38:12 -- accel/accel.sh@42 -- # jq -r . 00:06:56.851 [2024-07-13 10:38:13.004908] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:56.851 [2024-07-13 10:38:13.005015] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1980666 ] 00:06:56.851 EAL: No free 2048 kB hugepages reported on node 1 00:06:56.851 [2024-07-13 10:38:13.073715] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.851 [2024-07-13 10:38:13.109154] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.227 10:38:14 -- accel/accel.sh@18 -- # out=' 00:06:58.227 SPDK Configuration: 00:06:58.227 Core mask: 0x1 00:06:58.227 00:06:58.227 Accel Perf Configuration: 00:06:58.227 Workload Type: copy_crc32c 00:06:58.227 CRC-32C seed: 0 00:06:58.227 Vector size: 4096 bytes 00:06:58.227 Transfer size: 8192 bytes 00:06:58.227 Vector count 2 00:06:58.227 Module: software 00:06:58.227 Queue depth: 32 00:06:58.227 Allocate depth: 32 00:06:58.227 # threads/core: 1 00:06:58.227 Run time: 1 seconds 00:06:58.227 Verify: Yes 00:06:58.227 00:06:58.227 Running for 1 seconds... 
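-C 2 raises the vector count: each copy_crc32c operation now spans two 4096-byte source buffers, giving the 8192-byte transfer size shown in the dump above. Manual equivalent:

  "$SPDK_DIR/build/examples/accel_perf" -t 1 -w copy_crc32c -y -C 2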
00:06:58.227 00:06:58.227 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:58.227 ------------------------------------------------------------------------------------ 00:06:58.227 0,0 293440/s 2292 MiB/s 0 0 00:06:58.227 ==================================================================================== 00:06:58.227 Total 293440/s 2292 MiB/s 0 0' 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # IFS=: 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # read -r var val 00:06:58.227 10:38:14 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:58.227 10:38:14 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:58.227 10:38:14 -- accel/accel.sh@12 -- # build_accel_config 00:06:58.227 10:38:14 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:58.227 10:38:14 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:58.227 10:38:14 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:58.227 10:38:14 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:58.227 10:38:14 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:58.227 10:38:14 -- accel/accel.sh@41 -- # local IFS=, 00:06:58.227 10:38:14 -- accel/accel.sh@42 -- # jq -r . 00:06:58.227 [2024-07-13 10:38:14.290184] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:06:58.227 [2024-07-13 10:38:14.290270] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1980940 ] 00:06:58.227 EAL: No free 2048 kB hugepages reported on node 1 00:06:58.227 [2024-07-13 10:38:14.358268] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:58.227 [2024-07-13 10:38:14.393202] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.227 10:38:14 -- accel/accel.sh@21 -- # val= 00:06:58.227 10:38:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # IFS=: 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # read -r var val 00:06:58.227 10:38:14 -- accel/accel.sh@21 -- # val= 00:06:58.227 10:38:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # IFS=: 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # read -r var val 00:06:58.227 10:38:14 -- accel/accel.sh@21 -- # val=0x1 00:06:58.227 10:38:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # IFS=: 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # read -r var val 00:06:58.227 10:38:14 -- accel/accel.sh@21 -- # val= 00:06:58.227 10:38:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # IFS=: 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # read -r var val 00:06:58.227 10:38:14 -- accel/accel.sh@21 -- # val= 00:06:58.227 10:38:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # IFS=: 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # read -r var val 00:06:58.227 10:38:14 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:58.227 10:38:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.227 10:38:14 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # IFS=: 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # read -r var val 00:06:58.227 10:38:14 -- accel/accel.sh@21 -- # val=0 00:06:58.227 10:38:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # 
IFS=: 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # read -r var val 00:06:58.227 10:38:14 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:58.227 10:38:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # IFS=: 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # read -r var val 00:06:58.227 10:38:14 -- accel/accel.sh@21 -- # val='8192 bytes' 00:06:58.227 10:38:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # IFS=: 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # read -r var val 00:06:58.227 10:38:14 -- accel/accel.sh@21 -- # val= 00:06:58.227 10:38:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # IFS=: 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # read -r var val 00:06:58.227 10:38:14 -- accel/accel.sh@21 -- # val=software 00:06:58.227 10:38:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.227 10:38:14 -- accel/accel.sh@23 -- # accel_module=software 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # IFS=: 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # read -r var val 00:06:58.227 10:38:14 -- accel/accel.sh@21 -- # val=32 00:06:58.227 10:38:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # IFS=: 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # read -r var val 00:06:58.227 10:38:14 -- accel/accel.sh@21 -- # val=32 00:06:58.227 10:38:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # IFS=: 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # read -r var val 00:06:58.227 10:38:14 -- accel/accel.sh@21 -- # val=1 00:06:58.227 10:38:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # IFS=: 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # read -r var val 00:06:58.227 10:38:14 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:58.227 10:38:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # IFS=: 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # read -r var val 00:06:58.227 10:38:14 -- accel/accel.sh@21 -- # val=Yes 00:06:58.227 10:38:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # IFS=: 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # read -r var val 00:06:58.227 10:38:14 -- accel/accel.sh@21 -- # val= 00:06:58.227 10:38:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # IFS=: 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # read -r var val 00:06:58.227 10:38:14 -- accel/accel.sh@21 -- # val= 00:06:58.227 10:38:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # IFS=: 00:06:58.227 10:38:14 -- accel/accel.sh@20 -- # read -r var val 00:06:59.606 10:38:15 -- accel/accel.sh@21 -- # val= 00:06:59.606 10:38:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.606 10:38:15 -- accel/accel.sh@20 -- # IFS=: 00:06:59.606 10:38:15 -- accel/accel.sh@20 -- # read -r var val 00:06:59.606 10:38:15 -- accel/accel.sh@21 -- # val= 00:06:59.606 10:38:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.606 10:38:15 -- accel/accel.sh@20 -- # IFS=: 00:06:59.606 10:38:15 -- accel/accel.sh@20 -- # read -r var val 00:06:59.606 10:38:15 -- accel/accel.sh@21 -- # val= 00:06:59.606 10:38:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.606 10:38:15 -- accel/accel.sh@20 -- # IFS=: 00:06:59.606 10:38:15 -- accel/accel.sh@20 -- # read -r var val 00:06:59.606 10:38:15 -- accel/accel.sh@21 -- # val= 00:06:59.606 10:38:15 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:59.606 10:38:15 -- accel/accel.sh@20 -- # IFS=: 00:06:59.606 10:38:15 -- accel/accel.sh@20 -- # read -r var val 00:06:59.606 10:38:15 -- accel/accel.sh@21 -- # val= 00:06:59.606 10:38:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.606 10:38:15 -- accel/accel.sh@20 -- # IFS=: 00:06:59.606 10:38:15 -- accel/accel.sh@20 -- # read -r var val 00:06:59.606 10:38:15 -- accel/accel.sh@21 -- # val= 00:06:59.606 10:38:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.606 10:38:15 -- accel/accel.sh@20 -- # IFS=: 00:06:59.606 10:38:15 -- accel/accel.sh@20 -- # read -r var val 00:06:59.606 10:38:15 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:59.606 10:38:15 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:59.606 10:38:15 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:59.606 00:06:59.606 real 0m2.576s 00:06:59.606 user 0m2.337s 00:06:59.606 sys 0m0.250s 00:06:59.606 10:38:15 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:59.606 10:38:15 -- common/autotest_common.sh@10 -- # set +x 00:06:59.606 ************************************ 00:06:59.606 END TEST accel_copy_crc32c_C2 00:06:59.606 ************************************ 00:06:59.606 10:38:15 -- accel/accel.sh@99 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:06:59.606 10:38:15 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:06:59.606 10:38:15 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:06:59.606 10:38:15 -- common/autotest_common.sh@10 -- # set +x 00:06:59.606 ************************************ 00:06:59.606 START TEST accel_dualcast 00:06:59.606 ************************************ 00:06:59.606 10:38:15 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dualcast -y 00:06:59.606 10:38:15 -- accel/accel.sh@16 -- # local accel_opc 00:06:59.606 10:38:15 -- accel/accel.sh@17 -- # local accel_module 00:06:59.606 10:38:15 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dualcast -y 00:06:59.606 10:38:15 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:59.606 10:38:15 -- accel/accel.sh@12 -- # build_accel_config 00:06:59.606 10:38:15 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:59.606 10:38:15 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:59.606 10:38:15 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:59.606 10:38:15 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:59.606 10:38:15 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:59.606 10:38:15 -- accel/accel.sh@41 -- # local IFS=, 00:06:59.606 10:38:15 -- accel/accel.sh@42 -- # jq -r . 00:06:59.606 [2024-07-13 10:38:15.629182] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:06:59.606 [2024-07-13 10:38:15.629272] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1981188 ] 00:06:59.606 EAL: No free 2048 kB hugepages reported on node 1 00:06:59.606 [2024-07-13 10:38:15.698199] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:59.606 [2024-07-13 10:38:15.733993] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.545 10:38:16 -- accel/accel.sh@18 -- # out=' 00:07:00.545 SPDK Configuration: 00:07:00.545 Core mask: 0x1 00:07:00.545 00:07:00.545 Accel Perf Configuration: 00:07:00.545 Workload Type: dualcast 00:07:00.545 Transfer size: 4096 bytes 00:07:00.545 Vector count 1 00:07:00.545 Module: software 00:07:00.545 Queue depth: 32 00:07:00.545 Allocate depth: 32 00:07:00.545 # threads/core: 1 00:07:00.545 Run time: 1 seconds 00:07:00.545 Verify: Yes 00:07:00.545 00:07:00.545 Running for 1 seconds... 00:07:00.545 00:07:00.545 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:00.545 ------------------------------------------------------------------------------------ 00:07:00.545 0,0 676832/s 2643 MiB/s 0 0 00:07:00.545 ==================================================================================== 00:07:00.545 Total 676832/s 2643 MiB/s 0 0' 00:07:00.545 10:38:16 -- accel/accel.sh@20 -- # IFS=: 00:07:00.545 10:38:16 -- accel/accel.sh@20 -- # read -r var val 00:07:00.545 10:38:16 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:07:00.545 10:38:16 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:07:00.545 10:38:16 -- accel/accel.sh@12 -- # build_accel_config 00:07:00.545 10:38:16 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:00.545 10:38:16 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:00.545 10:38:16 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:00.545 10:38:16 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:00.545 10:38:16 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:00.545 10:38:16 -- accel/accel.sh@41 -- # local IFS=, 00:07:00.545 10:38:16 -- accel/accel.sh@42 -- # jq -r . 00:07:00.545 [2024-07-13 10:38:16.913734] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
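dualcast writes one 4096-byte source to two destination buffers per operation. Manual equivalent, under the same $SPDK_DIR assumption:

  "$SPDK_DIR/build/examples/accel_perf" -t 1 -w dualcast -y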
00:07:00.545 [2024-07-13 10:38:16.913825] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1981326 ] 00:07:00.805 EAL: No free 2048 kB hugepages reported on node 1 00:07:00.805 [2024-07-13 10:38:16.982876] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:00.805 [2024-07-13 10:38:17.017651] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.805 10:38:17 -- accel/accel.sh@21 -- # val= 00:07:00.805 10:38:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # IFS=: 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # read -r var val 00:07:00.805 10:38:17 -- accel/accel.sh@21 -- # val= 00:07:00.805 10:38:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # IFS=: 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # read -r var val 00:07:00.805 10:38:17 -- accel/accel.sh@21 -- # val=0x1 00:07:00.805 10:38:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # IFS=: 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # read -r var val 00:07:00.805 10:38:17 -- accel/accel.sh@21 -- # val= 00:07:00.805 10:38:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # IFS=: 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # read -r var val 00:07:00.805 10:38:17 -- accel/accel.sh@21 -- # val= 00:07:00.805 10:38:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # IFS=: 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # read -r var val 00:07:00.805 10:38:17 -- accel/accel.sh@21 -- # val=dualcast 00:07:00.805 10:38:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.805 10:38:17 -- accel/accel.sh@24 -- # accel_opc=dualcast 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # IFS=: 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # read -r var val 00:07:00.805 10:38:17 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:00.805 10:38:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # IFS=: 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # read -r var val 00:07:00.805 10:38:17 -- accel/accel.sh@21 -- # val= 00:07:00.805 10:38:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # IFS=: 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # read -r var val 00:07:00.805 10:38:17 -- accel/accel.sh@21 -- # val=software 00:07:00.805 10:38:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.805 10:38:17 -- accel/accel.sh@23 -- # accel_module=software 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # IFS=: 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # read -r var val 00:07:00.805 10:38:17 -- accel/accel.sh@21 -- # val=32 00:07:00.805 10:38:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # IFS=: 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # read -r var val 00:07:00.805 10:38:17 -- accel/accel.sh@21 -- # val=32 00:07:00.805 10:38:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # IFS=: 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # read -r var val 00:07:00.805 10:38:17 -- accel/accel.sh@21 -- # val=1 00:07:00.805 10:38:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # IFS=: 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # read -r var val 00:07:00.805 10:38:17 
-- accel/accel.sh@21 -- # val='1 seconds' 00:07:00.805 10:38:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # IFS=: 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # read -r var val 00:07:00.805 10:38:17 -- accel/accel.sh@21 -- # val=Yes 00:07:00.805 10:38:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # IFS=: 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # read -r var val 00:07:00.805 10:38:17 -- accel/accel.sh@21 -- # val= 00:07:00.805 10:38:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # IFS=: 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # read -r var val 00:07:00.805 10:38:17 -- accel/accel.sh@21 -- # val= 00:07:00.805 10:38:17 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # IFS=: 00:07:00.805 10:38:17 -- accel/accel.sh@20 -- # read -r var val 00:07:02.189 10:38:18 -- accel/accel.sh@21 -- # val= 00:07:02.189 10:38:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.189 10:38:18 -- accel/accel.sh@20 -- # IFS=: 00:07:02.189 10:38:18 -- accel/accel.sh@20 -- # read -r var val 00:07:02.189 10:38:18 -- accel/accel.sh@21 -- # val= 00:07:02.189 10:38:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.189 10:38:18 -- accel/accel.sh@20 -- # IFS=: 00:07:02.189 10:38:18 -- accel/accel.sh@20 -- # read -r var val 00:07:02.189 10:38:18 -- accel/accel.sh@21 -- # val= 00:07:02.189 10:38:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.189 10:38:18 -- accel/accel.sh@20 -- # IFS=: 00:07:02.189 10:38:18 -- accel/accel.sh@20 -- # read -r var val 00:07:02.189 10:38:18 -- accel/accel.sh@21 -- # val= 00:07:02.189 10:38:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.189 10:38:18 -- accel/accel.sh@20 -- # IFS=: 00:07:02.189 10:38:18 -- accel/accel.sh@20 -- # read -r var val 00:07:02.189 10:38:18 -- accel/accel.sh@21 -- # val= 00:07:02.189 10:38:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.189 10:38:18 -- accel/accel.sh@20 -- # IFS=: 00:07:02.189 10:38:18 -- accel/accel.sh@20 -- # read -r var val 00:07:02.189 10:38:18 -- accel/accel.sh@21 -- # val= 00:07:02.189 10:38:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.189 10:38:18 -- accel/accel.sh@20 -- # IFS=: 00:07:02.189 10:38:18 -- accel/accel.sh@20 -- # read -r var val 00:07:02.189 10:38:18 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:02.189 10:38:18 -- accel/accel.sh@28 -- # [[ -n dualcast ]] 00:07:02.189 10:38:18 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:02.189 00:07:02.189 real 0m2.576s 00:07:02.189 user 0m2.328s 00:07:02.189 sys 0m0.256s 00:07:02.189 10:38:18 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:02.189 10:38:18 -- common/autotest_common.sh@10 -- # set +x 00:07:02.189 ************************************ 00:07:02.189 END TEST accel_dualcast 00:07:02.189 ************************************ 00:07:02.189 10:38:18 -- accel/accel.sh@100 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:07:02.189 10:38:18 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:07:02.189 10:38:18 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:02.189 10:38:18 -- common/autotest_common.sh@10 -- # set +x 00:07:02.189 ************************************ 00:07:02.189 START TEST accel_compare 00:07:02.189 ************************************ 00:07:02.189 10:38:18 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w compare -y 00:07:02.189 10:38:18 -- accel/accel.sh@16 -- # local accel_opc 00:07:02.189 10:38:18 
-- accel/accel.sh@17 -- # local accel_module 00:07:02.189 10:38:18 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compare -y 00:07:02.189 10:38:18 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:07:02.189 10:38:18 -- accel/accel.sh@12 -- # build_accel_config 00:07:02.189 10:38:18 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:02.189 10:38:18 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:02.189 10:38:18 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:02.189 10:38:18 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:02.189 10:38:18 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:02.189 10:38:18 -- accel/accel.sh@41 -- # local IFS=, 00:07:02.189 10:38:18 -- accel/accel.sh@42 -- # jq -r . 00:07:02.189 [2024-07-13 10:38:18.255055] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:02.189 [2024-07-13 10:38:18.255149] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1981531 ] 00:07:02.189 EAL: No free 2048 kB hugepages reported on node 1 00:07:02.189 [2024-07-13 10:38:18.325677] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:02.189 [2024-07-13 10:38:18.361696] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.564 10:38:19 -- accel/accel.sh@18 -- # out=' 00:07:03.564 SPDK Configuration: 00:07:03.564 Core mask: 0x1 00:07:03.564 00:07:03.564 Accel Perf Configuration: 00:07:03.564 Workload Type: compare 00:07:03.564 Transfer size: 4096 bytes 00:07:03.564 Vector count 1 00:07:03.564 Module: software 00:07:03.564 Queue depth: 32 00:07:03.564 Allocate depth: 32 00:07:03.564 # threads/core: 1 00:07:03.564 Run time: 1 seconds 00:07:03.564 Verify: Yes 00:07:03.564 00:07:03.564 Running for 1 seconds... 00:07:03.564 00:07:03.564 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:03.564 ------------------------------------------------------------------------------------ 00:07:03.564 0,0 845440/s 3302 MiB/s 0 0 00:07:03.564 ==================================================================================== 00:07:03.564 Total 845440/s 3302 MiB/s 0 0' 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # IFS=: 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # read -r var val 00:07:03.564 10:38:19 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:07:03.564 10:38:19 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:07:03.564 10:38:19 -- accel/accel.sh@12 -- # build_accel_config 00:07:03.564 10:38:19 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:03.564 10:38:19 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:03.564 10:38:19 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:03.564 10:38:19 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:03.564 10:38:19 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:03.564 10:38:19 -- accel/accel.sh@41 -- # local IFS=, 00:07:03.564 10:38:19 -- accel/accel.sh@42 -- # jq -r . 00:07:03.564 [2024-07-13 10:38:19.541922] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
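compare checks two buffers for equality and reports differences in the Miscompares column; with nothing written it posts the highest rate of the series, 845440 ops/s. Manual equivalent:

  "$SPDK_DIR/build/examples/accel_perf" -t 1 -w compare -y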
00:07:03.564 [2024-07-13 10:38:19.542012] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1981800 ] 00:07:03.564 EAL: No free 2048 kB hugepages reported on node 1 00:07:03.564 [2024-07-13 10:38:19.611356] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:03.564 [2024-07-13 10:38:19.645688] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.564 10:38:19 -- accel/accel.sh@21 -- # val= 00:07:03.564 10:38:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # IFS=: 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # read -r var val 00:07:03.564 10:38:19 -- accel/accel.sh@21 -- # val= 00:07:03.564 10:38:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # IFS=: 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # read -r var val 00:07:03.564 10:38:19 -- accel/accel.sh@21 -- # val=0x1 00:07:03.564 10:38:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # IFS=: 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # read -r var val 00:07:03.564 10:38:19 -- accel/accel.sh@21 -- # val= 00:07:03.564 10:38:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # IFS=: 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # read -r var val 00:07:03.564 10:38:19 -- accel/accel.sh@21 -- # val= 00:07:03.564 10:38:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # IFS=: 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # read -r var val 00:07:03.564 10:38:19 -- accel/accel.sh@21 -- # val=compare 00:07:03.564 10:38:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.564 10:38:19 -- accel/accel.sh@24 -- # accel_opc=compare 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # IFS=: 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # read -r var val 00:07:03.564 10:38:19 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:03.564 10:38:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # IFS=: 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # read -r var val 00:07:03.564 10:38:19 -- accel/accel.sh@21 -- # val= 00:07:03.564 10:38:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # IFS=: 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # read -r var val 00:07:03.564 10:38:19 -- accel/accel.sh@21 -- # val=software 00:07:03.564 10:38:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.564 10:38:19 -- accel/accel.sh@23 -- # accel_module=software 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # IFS=: 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # read -r var val 00:07:03.564 10:38:19 -- accel/accel.sh@21 -- # val=32 00:07:03.564 10:38:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # IFS=: 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # read -r var val 00:07:03.564 10:38:19 -- accel/accel.sh@21 -- # val=32 00:07:03.564 10:38:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # IFS=: 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # read -r var val 00:07:03.564 10:38:19 -- accel/accel.sh@21 -- # val=1 00:07:03.564 10:38:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # IFS=: 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # read -r var val 00:07:03.564 10:38:19 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:07:03.564 10:38:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # IFS=: 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # read -r var val 00:07:03.564 10:38:19 -- accel/accel.sh@21 -- # val=Yes 00:07:03.564 10:38:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # IFS=: 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # read -r var val 00:07:03.564 10:38:19 -- accel/accel.sh@21 -- # val= 00:07:03.564 10:38:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # IFS=: 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # read -r var val 00:07:03.564 10:38:19 -- accel/accel.sh@21 -- # val= 00:07:03.564 10:38:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # IFS=: 00:07:03.564 10:38:19 -- accel/accel.sh@20 -- # read -r var val 00:07:04.502 10:38:20 -- accel/accel.sh@21 -- # val= 00:07:04.502 10:38:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.502 10:38:20 -- accel/accel.sh@20 -- # IFS=: 00:07:04.502 10:38:20 -- accel/accel.sh@20 -- # read -r var val 00:07:04.502 10:38:20 -- accel/accel.sh@21 -- # val= 00:07:04.502 10:38:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.502 10:38:20 -- accel/accel.sh@20 -- # IFS=: 00:07:04.502 10:38:20 -- accel/accel.sh@20 -- # read -r var val 00:07:04.502 10:38:20 -- accel/accel.sh@21 -- # val= 00:07:04.502 10:38:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.502 10:38:20 -- accel/accel.sh@20 -- # IFS=: 00:07:04.502 10:38:20 -- accel/accel.sh@20 -- # read -r var val 00:07:04.502 10:38:20 -- accel/accel.sh@21 -- # val= 00:07:04.502 10:38:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.502 10:38:20 -- accel/accel.sh@20 -- # IFS=: 00:07:04.502 10:38:20 -- accel/accel.sh@20 -- # read -r var val 00:07:04.502 10:38:20 -- accel/accel.sh@21 -- # val= 00:07:04.502 10:38:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.502 10:38:20 -- accel/accel.sh@20 -- # IFS=: 00:07:04.502 10:38:20 -- accel/accel.sh@20 -- # read -r var val 00:07:04.502 10:38:20 -- accel/accel.sh@21 -- # val= 00:07:04.502 10:38:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.502 10:38:20 -- accel/accel.sh@20 -- # IFS=: 00:07:04.502 10:38:20 -- accel/accel.sh@20 -- # read -r var val 00:07:04.502 10:38:20 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:04.502 10:38:20 -- accel/accel.sh@28 -- # [[ -n compare ]] 00:07:04.502 10:38:20 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:04.502 00:07:04.502 real 0m2.578s 00:07:04.502 user 0m2.333s 00:07:04.502 sys 0m0.252s 00:07:04.502 10:38:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:04.502 10:38:20 -- common/autotest_common.sh@10 -- # set +x 00:07:04.502 ************************************ 00:07:04.502 END TEST accel_compare 00:07:04.502 ************************************ 00:07:04.502 10:38:20 -- accel/accel.sh@101 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:07:04.502 10:38:20 -- common/autotest_common.sh@1077 -- # '[' 7 -le 1 ']' 00:07:04.502 10:38:20 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:04.502 10:38:20 -- common/autotest_common.sh@10 -- # set +x 00:07:04.502 ************************************ 00:07:04.502 START TEST accel_xor 00:07:04.502 ************************************ 00:07:04.502 10:38:20 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w xor -y 00:07:04.502 10:38:20 -- accel/accel.sh@16 -- # local accel_opc 00:07:04.502 10:38:20 -- accel/accel.sh@17 
-- # local accel_module 00:07:04.502 10:38:20 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y 00:07:04.502 10:38:20 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:07:04.502 10:38:20 -- accel/accel.sh@12 -- # build_accel_config 00:07:04.502 10:38:20 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:04.502 10:38:20 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:04.502 10:38:20 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:04.502 10:38:20 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:04.502 10:38:20 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:04.502 10:38:20 -- accel/accel.sh@41 -- # local IFS=, 00:07:04.502 10:38:20 -- accel/accel.sh@42 -- # jq -r . 00:07:04.502 [2024-07-13 10:38:20.880155] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:04.502 [2024-07-13 10:38:20.880254] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1982083 ] 00:07:04.761 EAL: No free 2048 kB hugepages reported on node 1 00:07:04.761 [2024-07-13 10:38:20.948735] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:04.761 [2024-07-13 10:38:20.984219] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.139 10:38:22 -- accel/accel.sh@18 -- # out=' 00:07:06.139 SPDK Configuration: 00:07:06.139 Core mask: 0x1 00:07:06.139 00:07:06.139 Accel Perf Configuration: 00:07:06.139 Workload Type: xor 00:07:06.139 Source buffers: 2 00:07:06.139 Transfer size: 4096 bytes 00:07:06.139 Vector count 1 00:07:06.139 Module: software 00:07:06.139 Queue depth: 32 00:07:06.139 Allocate depth: 32 00:07:06.139 # threads/core: 1 00:07:06.139 Run time: 1 seconds 00:07:06.139 Verify: Yes 00:07:06.139 00:07:06.139 Running for 1 seconds... 00:07:06.139 00:07:06.139 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:06.139 ------------------------------------------------------------------------------------ 00:07:06.139 0,0 669728/s 2616 MiB/s 0 0 00:07:06.140 ==================================================================================== 00:07:06.140 Total 669728/s 2616 MiB/s 0 0' 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # IFS=: 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # read -r var val 00:07:06.140 10:38:22 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:07:06.140 10:38:22 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:07:06.140 10:38:22 -- accel/accel.sh@12 -- # build_accel_config 00:07:06.140 10:38:22 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:06.140 10:38:22 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:06.140 10:38:22 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:06.140 10:38:22 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:06.140 10:38:22 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:06.140 10:38:22 -- accel/accel.sh@41 -- # local IFS=, 00:07:06.140 10:38:22 -- accel/accel.sh@42 -- # jq -r . 00:07:06.140 [2024-07-13 10:38:22.163647] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:07:06.140 [2024-07-13 10:38:22.163738] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1982349 ] 00:07:06.140 EAL: No free 2048 kB hugepages reported on node 1 00:07:06.140 [2024-07-13 10:38:22.231614] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:06.140 [2024-07-13 10:38:22.265634] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.140 10:38:22 -- accel/accel.sh@21 -- # val= 00:07:06.140 10:38:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # IFS=: 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # read -r var val 00:07:06.140 10:38:22 -- accel/accel.sh@21 -- # val= 00:07:06.140 10:38:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # IFS=: 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # read -r var val 00:07:06.140 10:38:22 -- accel/accel.sh@21 -- # val=0x1 00:07:06.140 10:38:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # IFS=: 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # read -r var val 00:07:06.140 10:38:22 -- accel/accel.sh@21 -- # val= 00:07:06.140 10:38:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # IFS=: 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # read -r var val 00:07:06.140 10:38:22 -- accel/accel.sh@21 -- # val= 00:07:06.140 10:38:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # IFS=: 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # read -r var val 00:07:06.140 10:38:22 -- accel/accel.sh@21 -- # val=xor 00:07:06.140 10:38:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.140 10:38:22 -- accel/accel.sh@24 -- # accel_opc=xor 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # IFS=: 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # read -r var val 00:07:06.140 10:38:22 -- accel/accel.sh@21 -- # val=2 00:07:06.140 10:38:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # IFS=: 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # read -r var val 00:07:06.140 10:38:22 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:06.140 10:38:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # IFS=: 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # read -r var val 00:07:06.140 10:38:22 -- accel/accel.sh@21 -- # val= 00:07:06.140 10:38:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # IFS=: 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # read -r var val 00:07:06.140 10:38:22 -- accel/accel.sh@21 -- # val=software 00:07:06.140 10:38:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.140 10:38:22 -- accel/accel.sh@23 -- # accel_module=software 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # IFS=: 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # read -r var val 00:07:06.140 10:38:22 -- accel/accel.sh@21 -- # val=32 00:07:06.140 10:38:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # IFS=: 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # read -r var val 00:07:06.140 10:38:22 -- accel/accel.sh@21 -- # val=32 00:07:06.140 10:38:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # IFS=: 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # read -r var val 00:07:06.140 10:38:22 -- 
accel/accel.sh@21 -- # val=1 00:07:06.140 10:38:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # IFS=: 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # read -r var val 00:07:06.140 10:38:22 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:06.140 10:38:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # IFS=: 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # read -r var val 00:07:06.140 10:38:22 -- accel/accel.sh@21 -- # val=Yes 00:07:06.140 10:38:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # IFS=: 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # read -r var val 00:07:06.140 10:38:22 -- accel/accel.sh@21 -- # val= 00:07:06.140 10:38:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # IFS=: 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # read -r var val 00:07:06.140 10:38:22 -- accel/accel.sh@21 -- # val= 00:07:06.140 10:38:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # IFS=: 00:07:06.140 10:38:22 -- accel/accel.sh@20 -- # read -r var val 00:07:07.075 10:38:23 -- accel/accel.sh@21 -- # val= 00:07:07.075 10:38:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.075 10:38:23 -- accel/accel.sh@20 -- # IFS=: 00:07:07.075 10:38:23 -- accel/accel.sh@20 -- # read -r var val 00:07:07.075 10:38:23 -- accel/accel.sh@21 -- # val= 00:07:07.075 10:38:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.075 10:38:23 -- accel/accel.sh@20 -- # IFS=: 00:07:07.075 10:38:23 -- accel/accel.sh@20 -- # read -r var val 00:07:07.075 10:38:23 -- accel/accel.sh@21 -- # val= 00:07:07.075 10:38:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.075 10:38:23 -- accel/accel.sh@20 -- # IFS=: 00:07:07.075 10:38:23 -- accel/accel.sh@20 -- # read -r var val 00:07:07.075 10:38:23 -- accel/accel.sh@21 -- # val= 00:07:07.075 10:38:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.075 10:38:23 -- accel/accel.sh@20 -- # IFS=: 00:07:07.075 10:38:23 -- accel/accel.sh@20 -- # read -r var val 00:07:07.075 10:38:23 -- accel/accel.sh@21 -- # val= 00:07:07.075 10:38:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.075 10:38:23 -- accel/accel.sh@20 -- # IFS=: 00:07:07.075 10:38:23 -- accel/accel.sh@20 -- # read -r var val 00:07:07.075 10:38:23 -- accel/accel.sh@21 -- # val= 00:07:07.075 10:38:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.075 10:38:23 -- accel/accel.sh@20 -- # IFS=: 00:07:07.075 10:38:23 -- accel/accel.sh@20 -- # read -r var val 00:07:07.075 10:38:23 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:07.075 10:38:23 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:07:07.075 10:38:23 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:07.075 00:07:07.075 real 0m2.572s 00:07:07.075 user 0m2.327s 00:07:07.075 sys 0m0.254s 00:07:07.075 10:38:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:07.075 10:38:23 -- common/autotest_common.sh@10 -- # set +x 00:07:07.075 ************************************ 00:07:07.075 END TEST accel_xor 00:07:07.075 ************************************ 00:07:07.333 10:38:23 -- accel/accel.sh@102 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:07:07.333 10:38:23 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:07:07.333 10:38:23 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:07.333 10:38:23 -- common/autotest_common.sh@10 -- # set +x 00:07:07.333 ************************************ 00:07:07.333 START TEST accel_xor 
00:07:07.333 ************************************ 00:07:07.333 10:38:23 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w xor -y -x 3 00:07:07.333 10:38:23 -- accel/accel.sh@16 -- # local accel_opc 00:07:07.333 10:38:23 -- accel/accel.sh@17 -- # local accel_module 00:07:07.333 10:38:23 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y -x 3 00:07:07.333 10:38:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:07:07.333 10:38:23 -- accel/accel.sh@12 -- # build_accel_config 00:07:07.333 10:38:23 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:07.333 10:38:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:07.333 10:38:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:07.333 10:38:23 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:07.333 10:38:23 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:07.333 10:38:23 -- accel/accel.sh@41 -- # local IFS=, 00:07:07.333 10:38:23 -- accel/accel.sh@42 -- # jq -r . 00:07:07.333 [2024-07-13 10:38:23.501087] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:07.333 [2024-07-13 10:38:23.501182] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1982638 ] 00:07:07.333 EAL: No free 2048 kB hugepages reported on node 1 00:07:07.333 [2024-07-13 10:38:23.571228] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:07.333 [2024-07-13 10:38:23.607246] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.709 10:38:24 -- accel/accel.sh@18 -- # out=' 00:07:08.709 SPDK Configuration: 00:07:08.709 Core mask: 0x1 00:07:08.709 00:07:08.709 Accel Perf Configuration: 00:07:08.709 Workload Type: xor 00:07:08.709 Source buffers: 3 00:07:08.709 Transfer size: 4096 bytes 00:07:08.709 Vector count 1 00:07:08.709 Module: software 00:07:08.709 Queue depth: 32 00:07:08.709 Allocate depth: 32 00:07:08.709 # threads/core: 1 00:07:08.709 Run time: 1 seconds 00:07:08.709 Verify: Yes 00:07:08.709 00:07:08.709 Running for 1 seconds... 00:07:08.709 00:07:08.709 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:08.709 ------------------------------------------------------------------------------------ 00:07:08.709 0,0 645408/s 2521 MiB/s 0 0 00:07:08.709 ==================================================================================== 00:07:08.709 Total 645408/s 2521 MiB/s 0 0' 00:07:08.709 10:38:24 -- accel/accel.sh@20 -- # IFS=: 00:07:08.709 10:38:24 -- accel/accel.sh@20 -- # read -r var val 00:07:08.709 10:38:24 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:07:08.709 10:38:24 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:07:08.709 10:38:24 -- accel/accel.sh@12 -- # build_accel_config 00:07:08.709 10:38:24 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:08.709 10:38:24 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:08.709 10:38:24 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:08.709 10:38:24 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:08.709 10:38:24 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:08.709 10:38:24 -- accel/accel.sh@41 -- # local IFS=, 00:07:08.709 10:38:24 -- accel/accel.sh@42 -- # jq -r . 00:07:08.709 [2024-07-13 10:38:24.788477] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:07:08.709 [2024-07-13 10:38:24.788568] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1982828 ] 00:07:08.709 EAL: No free 2048 kB hugepages reported on node 1 00:07:08.709 [2024-07-13 10:38:24.858838] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:08.709 [2024-07-13 10:38:24.893156] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.709 10:38:24 -- accel/accel.sh@21 -- # val= 00:07:08.709 10:38:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.709 10:38:24 -- accel/accel.sh@20 -- # IFS=: 00:07:08.709 10:38:24 -- accel/accel.sh@20 -- # read -r var val 00:07:08.709 10:38:24 -- accel/accel.sh@21 -- # val= 00:07:08.709 10:38:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.709 10:38:24 -- accel/accel.sh@20 -- # IFS=: 00:07:08.709 10:38:24 -- accel/accel.sh@20 -- # read -r var val 00:07:08.709 10:38:24 -- accel/accel.sh@21 -- # val=0x1 00:07:08.709 10:38:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.709 10:38:24 -- accel/accel.sh@20 -- # IFS=: 00:07:08.709 10:38:24 -- accel/accel.sh@20 -- # read -r var val 00:07:08.709 10:38:24 -- accel/accel.sh@21 -- # val= 00:07:08.709 10:38:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.709 10:38:24 -- accel/accel.sh@20 -- # IFS=: 00:07:08.709 10:38:24 -- accel/accel.sh@20 -- # read -r var val 00:07:08.709 10:38:24 -- accel/accel.sh@21 -- # val= 00:07:08.709 10:38:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.709 10:38:24 -- accel/accel.sh@20 -- # IFS=: 00:07:08.709 10:38:24 -- accel/accel.sh@20 -- # read -r var val 00:07:08.709 10:38:24 -- accel/accel.sh@21 -- # val=xor 00:07:08.709 10:38:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.709 10:38:24 -- accel/accel.sh@24 -- # accel_opc=xor 00:07:08.709 10:38:24 -- accel/accel.sh@20 -- # IFS=: 00:07:08.709 10:38:24 -- accel/accel.sh@20 -- # read -r var val 00:07:08.709 10:38:24 -- accel/accel.sh@21 -- # val=3 00:07:08.709 10:38:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.709 10:38:24 -- accel/accel.sh@20 -- # IFS=: 00:07:08.709 10:38:24 -- accel/accel.sh@20 -- # read -r var val 00:07:08.709 10:38:24 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:08.709 10:38:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.709 10:38:24 -- accel/accel.sh@20 -- # IFS=: 00:07:08.709 10:38:24 -- accel/accel.sh@20 -- # read -r var val 00:07:08.709 10:38:24 -- accel/accel.sh@21 -- # val= 00:07:08.709 10:38:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.709 10:38:24 -- accel/accel.sh@20 -- # IFS=: 00:07:08.709 10:38:24 -- accel/accel.sh@20 -- # read -r var val 00:07:08.709 10:38:24 -- accel/accel.sh@21 -- # val=software 00:07:08.709 10:38:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.709 10:38:24 -- accel/accel.sh@23 -- # accel_module=software 00:07:08.709 10:38:24 -- accel/accel.sh@20 -- # IFS=: 00:07:08.709 10:38:24 -- accel/accel.sh@20 -- # read -r var val 00:07:08.709 10:38:24 -- accel/accel.sh@21 -- # val=32 00:07:08.709 10:38:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.709 10:38:24 -- accel/accel.sh@20 -- # IFS=: 00:07:08.709 10:38:24 -- accel/accel.sh@20 -- # read -r var val 00:07:08.709 10:38:24 -- accel/accel.sh@21 -- # val=32 00:07:08.709 10:38:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.710 10:38:24 -- accel/accel.sh@20 -- # IFS=: 00:07:08.710 10:38:24 -- accel/accel.sh@20 -- # read -r var val 00:07:08.710 10:38:24 -- 
accel/accel.sh@21 -- # val=1 00:07:08.710 10:38:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.710 10:38:24 -- accel/accel.sh@20 -- # IFS=: 00:07:08.710 10:38:24 -- accel/accel.sh@20 -- # read -r var val 00:07:08.710 10:38:24 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:08.710 10:38:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.710 10:38:24 -- accel/accel.sh@20 -- # IFS=: 00:07:08.710 10:38:24 -- accel/accel.sh@20 -- # read -r var val 00:07:08.710 10:38:24 -- accel/accel.sh@21 -- # val=Yes 00:07:08.710 10:38:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.710 10:38:24 -- accel/accel.sh@20 -- # IFS=: 00:07:08.710 10:38:24 -- accel/accel.sh@20 -- # read -r var val 00:07:08.710 10:38:24 -- accel/accel.sh@21 -- # val= 00:07:08.710 10:38:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.710 10:38:24 -- accel/accel.sh@20 -- # IFS=: 00:07:08.710 10:38:24 -- accel/accel.sh@20 -- # read -r var val 00:07:08.710 10:38:24 -- accel/accel.sh@21 -- # val= 00:07:08.710 10:38:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.710 10:38:24 -- accel/accel.sh@20 -- # IFS=: 00:07:08.710 10:38:24 -- accel/accel.sh@20 -- # read -r var val 00:07:10.086 10:38:26 -- accel/accel.sh@21 -- # val= 00:07:10.086 10:38:26 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.086 10:38:26 -- accel/accel.sh@20 -- # IFS=: 00:07:10.086 10:38:26 -- accel/accel.sh@20 -- # read -r var val 00:07:10.086 10:38:26 -- accel/accel.sh@21 -- # val= 00:07:10.086 10:38:26 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.086 10:38:26 -- accel/accel.sh@20 -- # IFS=: 00:07:10.086 10:38:26 -- accel/accel.sh@20 -- # read -r var val 00:07:10.086 10:38:26 -- accel/accel.sh@21 -- # val= 00:07:10.086 10:38:26 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.086 10:38:26 -- accel/accel.sh@20 -- # IFS=: 00:07:10.086 10:38:26 -- accel/accel.sh@20 -- # read -r var val 00:07:10.086 10:38:26 -- accel/accel.sh@21 -- # val= 00:07:10.086 10:38:26 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.086 10:38:26 -- accel/accel.sh@20 -- # IFS=: 00:07:10.086 10:38:26 -- accel/accel.sh@20 -- # read -r var val 00:07:10.086 10:38:26 -- accel/accel.sh@21 -- # val= 00:07:10.086 10:38:26 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.086 10:38:26 -- accel/accel.sh@20 -- # IFS=: 00:07:10.086 10:38:26 -- accel/accel.sh@20 -- # read -r var val 00:07:10.086 10:38:26 -- accel/accel.sh@21 -- # val= 00:07:10.086 10:38:26 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.086 10:38:26 -- accel/accel.sh@20 -- # IFS=: 00:07:10.086 10:38:26 -- accel/accel.sh@20 -- # read -r var val 00:07:10.086 10:38:26 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:10.086 10:38:26 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:07:10.086 10:38:26 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:10.086 00:07:10.086 real 0m2.579s 00:07:10.086 user 0m2.327s 00:07:10.086 sys 0m0.260s 00:07:10.086 10:38:26 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:10.086 10:38:26 -- common/autotest_common.sh@10 -- # set +x 00:07:10.086 ************************************ 00:07:10.086 END TEST accel_xor 00:07:10.086 ************************************ 00:07:10.086 10:38:26 -- accel/accel.sh@103 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:07:10.086 10:38:26 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:07:10.086 10:38:26 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:10.086 10:38:26 -- common/autotest_common.sh@10 -- # set +x 00:07:10.086 ************************************ 00:07:10.086 START TEST 
accel_dif_verify 00:07:10.086 ************************************ 00:07:10.086 10:38:26 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_verify 00:07:10.086 10:38:26 -- accel/accel.sh@16 -- # local accel_opc 00:07:10.086 10:38:26 -- accel/accel.sh@17 -- # local accel_module 00:07:10.086 10:38:26 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_verify 00:07:10.086 10:38:26 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:10.086 10:38:26 -- accel/accel.sh@12 -- # build_accel_config 00:07:10.086 10:38:26 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:10.086 10:38:26 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:10.086 10:38:26 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:10.086 10:38:26 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:10.086 10:38:26 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:10.086 10:38:26 -- accel/accel.sh@41 -- # local IFS=, 00:07:10.086 10:38:26 -- accel/accel.sh@42 -- # jq -r . 00:07:10.086 [2024-07-13 10:38:26.130518] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:10.086 [2024-07-13 10:38:26.130601] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1983010 ] 00:07:10.086 EAL: No free 2048 kB hugepages reported on node 1 00:07:10.086 [2024-07-13 10:38:26.200097] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:10.086 [2024-07-13 10:38:26.235394] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.023 10:38:27 -- accel/accel.sh@18 -- # out=' 00:07:11.023 SPDK Configuration: 00:07:11.023 Core mask: 0x1 00:07:11.023 00:07:11.023 Accel Perf Configuration: 00:07:11.023 Workload Type: dif_verify 00:07:11.023 Vector size: 4096 bytes 00:07:11.023 Transfer size: 4096 bytes 00:07:11.023 Block size: 512 bytes 00:07:11.023 Metadata size: 8 bytes 00:07:11.023 Vector count 1 00:07:11.023 Module: software 00:07:11.023 Queue depth: 32 00:07:11.023 Allocate depth: 32 00:07:11.023 # threads/core: 1 00:07:11.023 Run time: 1 seconds 00:07:11.023 Verify: No 00:07:11.023 00:07:11.023 Running for 1 seconds... 00:07:11.023 00:07:11.023 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:11.023 ------------------------------------------------------------------------------------ 00:07:11.023 0,0 241248/s 957 MiB/s 0 0 00:07:11.023 ==================================================================================== 00:07:11.023 Total 241248/s 942 MiB/s 0 0' 00:07:11.023 10:38:27 -- accel/accel.sh@20 -- # IFS=: 00:07:11.023 10:38:27 -- accel/accel.sh@20 -- # read -r var val 00:07:11.023 10:38:27 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:07:11.023 10:38:27 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:11.023 10:38:27 -- accel/accel.sh@12 -- # build_accel_config 00:07:11.023 10:38:27 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:11.023 10:38:27 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:11.023 10:38:27 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:11.023 10:38:27 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:11.023 10:38:27 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:11.023 10:38:27 -- accel/accel.sh@41 -- # local IFS=, 00:07:11.023 10:38:27 -- accel/accel.sh@42 -- # jq -r . 
00:07:11.282 [2024-07-13 10:38:27.416180] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:11.282 [2024-07-13 10:38:27.416275] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1983208 ] 00:07:11.282 EAL: No free 2048 kB hugepages reported on node 1 00:07:11.282 [2024-07-13 10:38:27.487493] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:11.282 [2024-07-13 10:38:27.522747] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.282 10:38:27 -- accel/accel.sh@21 -- # val= 00:07:11.282 10:38:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # IFS=: 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # read -r var val 00:07:11.282 10:38:27 -- accel/accel.sh@21 -- # val= 00:07:11.282 10:38:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # IFS=: 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # read -r var val 00:07:11.282 10:38:27 -- accel/accel.sh@21 -- # val=0x1 00:07:11.282 10:38:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # IFS=: 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # read -r var val 00:07:11.282 10:38:27 -- accel/accel.sh@21 -- # val= 00:07:11.282 10:38:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # IFS=: 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # read -r var val 00:07:11.282 10:38:27 -- accel/accel.sh@21 -- # val= 00:07:11.282 10:38:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # IFS=: 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # read -r var val 00:07:11.282 10:38:27 -- accel/accel.sh@21 -- # val=dif_verify 00:07:11.282 10:38:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.282 10:38:27 -- accel/accel.sh@24 -- # accel_opc=dif_verify 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # IFS=: 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # read -r var val 00:07:11.282 10:38:27 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:11.282 10:38:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # IFS=: 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # read -r var val 00:07:11.282 10:38:27 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:11.282 10:38:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # IFS=: 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # read -r var val 00:07:11.282 10:38:27 -- accel/accel.sh@21 -- # val='512 bytes' 00:07:11.282 10:38:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # IFS=: 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # read -r var val 00:07:11.282 10:38:27 -- accel/accel.sh@21 -- # val='8 bytes' 00:07:11.282 10:38:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # IFS=: 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # read -r var val 00:07:11.282 10:38:27 -- accel/accel.sh@21 -- # val= 00:07:11.282 10:38:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # IFS=: 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # read -r var val 00:07:11.282 10:38:27 -- accel/accel.sh@21 -- # val=software 00:07:11.282 10:38:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.282 10:38:27 -- accel/accel.sh@23 -- # 
accel_module=software 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # IFS=: 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # read -r var val 00:07:11.282 10:38:27 -- accel/accel.sh@21 -- # val=32 00:07:11.282 10:38:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # IFS=: 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # read -r var val 00:07:11.282 10:38:27 -- accel/accel.sh@21 -- # val=32 00:07:11.282 10:38:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # IFS=: 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # read -r var val 00:07:11.282 10:38:27 -- accel/accel.sh@21 -- # val=1 00:07:11.282 10:38:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # IFS=: 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # read -r var val 00:07:11.282 10:38:27 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:11.282 10:38:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # IFS=: 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # read -r var val 00:07:11.282 10:38:27 -- accel/accel.sh@21 -- # val=No 00:07:11.282 10:38:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # IFS=: 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # read -r var val 00:07:11.282 10:38:27 -- accel/accel.sh@21 -- # val= 00:07:11.282 10:38:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # IFS=: 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # read -r var val 00:07:11.282 10:38:27 -- accel/accel.sh@21 -- # val= 00:07:11.282 10:38:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # IFS=: 00:07:11.282 10:38:27 -- accel/accel.sh@20 -- # read -r var val 00:07:12.661 10:38:28 -- accel/accel.sh@21 -- # val= 00:07:12.661 10:38:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.661 10:38:28 -- accel/accel.sh@20 -- # IFS=: 00:07:12.661 10:38:28 -- accel/accel.sh@20 -- # read -r var val 00:07:12.661 10:38:28 -- accel/accel.sh@21 -- # val= 00:07:12.661 10:38:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.661 10:38:28 -- accel/accel.sh@20 -- # IFS=: 00:07:12.661 10:38:28 -- accel/accel.sh@20 -- # read -r var val 00:07:12.661 10:38:28 -- accel/accel.sh@21 -- # val= 00:07:12.661 10:38:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.661 10:38:28 -- accel/accel.sh@20 -- # IFS=: 00:07:12.661 10:38:28 -- accel/accel.sh@20 -- # read -r var val 00:07:12.661 10:38:28 -- accel/accel.sh@21 -- # val= 00:07:12.661 10:38:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.661 10:38:28 -- accel/accel.sh@20 -- # IFS=: 00:07:12.661 10:38:28 -- accel/accel.sh@20 -- # read -r var val 00:07:12.661 10:38:28 -- accel/accel.sh@21 -- # val= 00:07:12.661 10:38:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.661 10:38:28 -- accel/accel.sh@20 -- # IFS=: 00:07:12.661 10:38:28 -- accel/accel.sh@20 -- # read -r var val 00:07:12.661 10:38:28 -- accel/accel.sh@21 -- # val= 00:07:12.661 10:38:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.661 10:38:28 -- accel/accel.sh@20 -- # IFS=: 00:07:12.661 10:38:28 -- accel/accel.sh@20 -- # read -r var val 00:07:12.661 10:38:28 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:12.661 10:38:28 -- accel/accel.sh@28 -- # [[ -n dif_verify ]] 00:07:12.661 10:38:28 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:12.661 00:07:12.661 real 0m2.579s 00:07:12.661 user 0m2.331s 00:07:12.661 sys 0m0.258s 00:07:12.661 10:38:28 -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:07:12.661 10:38:28 -- common/autotest_common.sh@10 -- # set +x 00:07:12.661 ************************************ 00:07:12.661 END TEST accel_dif_verify 00:07:12.661 ************************************ 00:07:12.661 10:38:28 -- accel/accel.sh@104 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:07:12.661 10:38:28 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:07:12.661 10:38:28 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:12.661 10:38:28 -- common/autotest_common.sh@10 -- # set +x 00:07:12.661 ************************************ 00:07:12.661 START TEST accel_dif_generate 00:07:12.661 ************************************ 00:07:12.661 10:38:28 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_generate 00:07:12.661 10:38:28 -- accel/accel.sh@16 -- # local accel_opc 00:07:12.661 10:38:28 -- accel/accel.sh@17 -- # local accel_module 00:07:12.661 10:38:28 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate 00:07:12.661 10:38:28 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:12.661 10:38:28 -- accel/accel.sh@12 -- # build_accel_config 00:07:12.661 10:38:28 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:12.661 10:38:28 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:12.661 10:38:28 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:12.661 10:38:28 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:12.661 10:38:28 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:12.661 10:38:28 -- accel/accel.sh@41 -- # local IFS=, 00:07:12.661 10:38:28 -- accel/accel.sh@42 -- # jq -r . 00:07:12.661 [2024-07-13 10:38:28.758266] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:12.661 [2024-07-13 10:38:28.758377] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1983498 ] 00:07:12.661 EAL: No free 2048 kB hugepages reported on node 1 00:07:12.661 [2024-07-13 10:38:28.826649] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:12.661 [2024-07-13 10:38:28.861758] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.039 10:38:30 -- accel/accel.sh@18 -- # out=' 00:07:14.039 SPDK Configuration: 00:07:14.039 Core mask: 0x1 00:07:14.039 00:07:14.039 Accel Perf Configuration: 00:07:14.039 Workload Type: dif_generate 00:07:14.039 Vector size: 4096 bytes 00:07:14.039 Transfer size: 4096 bytes 00:07:14.039 Block size: 512 bytes 00:07:14.039 Metadata size: 8 bytes 00:07:14.039 Vector count 1 00:07:14.039 Module: software 00:07:14.039 Queue depth: 32 00:07:14.039 Allocate depth: 32 00:07:14.039 # threads/core: 1 00:07:14.039 Run time: 1 seconds 00:07:14.039 Verify: No 00:07:14.039 00:07:14.039 Running for 1 seconds... 
00:07:14.039 00:07:14.039 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:14.039 ------------------------------------------------------------------------------------ 00:07:14.039 0,0 293824/s 1165 MiB/s 0 0 00:07:14.039 ==================================================================================== 00:07:14.039 Total 293824/s 1147 MiB/s 0 0' 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # IFS=: 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # read -r var val 00:07:14.039 10:38:30 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:07:14.039 10:38:30 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:14.039 10:38:30 -- accel/accel.sh@12 -- # build_accel_config 00:07:14.039 10:38:30 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:14.039 10:38:30 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:14.039 10:38:30 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:14.039 10:38:30 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:14.039 10:38:30 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:14.039 10:38:30 -- accel/accel.sh@41 -- # local IFS=, 00:07:14.039 10:38:30 -- accel/accel.sh@42 -- # jq -r . 00:07:14.039 [2024-07-13 10:38:30.044348] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:14.039 [2024-07-13 10:38:30.044452] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1983766 ] 00:07:14.039 EAL: No free 2048 kB hugepages reported on node 1 00:07:14.039 [2024-07-13 10:38:30.114949] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:14.039 [2024-07-13 10:38:30.149849] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.039 10:38:30 -- accel/accel.sh@21 -- # val= 00:07:14.039 10:38:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # IFS=: 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # read -r var val 00:07:14.039 10:38:30 -- accel/accel.sh@21 -- # val= 00:07:14.039 10:38:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # IFS=: 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # read -r var val 00:07:14.039 10:38:30 -- accel/accel.sh@21 -- # val=0x1 00:07:14.039 10:38:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # IFS=: 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # read -r var val 00:07:14.039 10:38:30 -- accel/accel.sh@21 -- # val= 00:07:14.039 10:38:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # IFS=: 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # read -r var val 00:07:14.039 10:38:30 -- accel/accel.sh@21 -- # val= 00:07:14.039 10:38:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # IFS=: 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # read -r var val 00:07:14.039 10:38:30 -- accel/accel.sh@21 -- # val=dif_generate 00:07:14.039 10:38:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.039 10:38:30 -- accel/accel.sh@24 -- # accel_opc=dif_generate 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # IFS=: 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # read -r var val 00:07:14.039 10:38:30 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:14.039 10:38:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # IFS=: 
00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # read -r var val 00:07:14.039 10:38:30 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:14.039 10:38:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # IFS=: 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # read -r var val 00:07:14.039 10:38:30 -- accel/accel.sh@21 -- # val='512 bytes' 00:07:14.039 10:38:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # IFS=: 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # read -r var val 00:07:14.039 10:38:30 -- accel/accel.sh@21 -- # val='8 bytes' 00:07:14.039 10:38:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # IFS=: 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # read -r var val 00:07:14.039 10:38:30 -- accel/accel.sh@21 -- # val= 00:07:14.039 10:38:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # IFS=: 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # read -r var val 00:07:14.039 10:38:30 -- accel/accel.sh@21 -- # val=software 00:07:14.039 10:38:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.039 10:38:30 -- accel/accel.sh@23 -- # accel_module=software 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # IFS=: 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # read -r var val 00:07:14.039 10:38:30 -- accel/accel.sh@21 -- # val=32 00:07:14.039 10:38:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # IFS=: 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # read -r var val 00:07:14.039 10:38:30 -- accel/accel.sh@21 -- # val=32 00:07:14.039 10:38:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # IFS=: 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # read -r var val 00:07:14.039 10:38:30 -- accel/accel.sh@21 -- # val=1 00:07:14.039 10:38:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # IFS=: 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # read -r var val 00:07:14.039 10:38:30 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:14.039 10:38:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # IFS=: 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # read -r var val 00:07:14.039 10:38:30 -- accel/accel.sh@21 -- # val=No 00:07:14.039 10:38:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # IFS=: 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # read -r var val 00:07:14.039 10:38:30 -- accel/accel.sh@21 -- # val= 00:07:14.039 10:38:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # IFS=: 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # read -r var val 00:07:14.039 10:38:30 -- accel/accel.sh@21 -- # val= 00:07:14.039 10:38:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # IFS=: 00:07:14.039 10:38:30 -- accel/accel.sh@20 -- # read -r var val 00:07:14.977 10:38:31 -- accel/accel.sh@21 -- # val= 00:07:14.977 10:38:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.977 10:38:31 -- accel/accel.sh@20 -- # IFS=: 00:07:14.977 10:38:31 -- accel/accel.sh@20 -- # read -r var val 00:07:14.977 10:38:31 -- accel/accel.sh@21 -- # val= 00:07:14.977 10:38:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.977 10:38:31 -- accel/accel.sh@20 -- # IFS=: 00:07:14.977 10:38:31 -- accel/accel.sh@20 -- # read -r var val 00:07:14.977 10:38:31 -- accel/accel.sh@21 -- # val= 00:07:14.977 10:38:31 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:14.977 10:38:31 -- accel/accel.sh@20 -- # IFS=: 00:07:14.977 10:38:31 -- accel/accel.sh@20 -- # read -r var val 00:07:14.977 10:38:31 -- accel/accel.sh@21 -- # val= 00:07:14.977 10:38:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.977 10:38:31 -- accel/accel.sh@20 -- # IFS=: 00:07:14.977 10:38:31 -- accel/accel.sh@20 -- # read -r var val 00:07:14.977 10:38:31 -- accel/accel.sh@21 -- # val= 00:07:14.977 10:38:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.977 10:38:31 -- accel/accel.sh@20 -- # IFS=: 00:07:14.977 10:38:31 -- accel/accel.sh@20 -- # read -r var val 00:07:14.977 10:38:31 -- accel/accel.sh@21 -- # val= 00:07:14.977 10:38:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.977 10:38:31 -- accel/accel.sh@20 -- # IFS=: 00:07:14.977 10:38:31 -- accel/accel.sh@20 -- # read -r var val 00:07:14.977 10:38:31 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:14.977 10:38:31 -- accel/accel.sh@28 -- # [[ -n dif_generate ]] 00:07:14.977 10:38:31 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:14.977 00:07:14.977 real 0m2.578s 00:07:14.977 user 0m2.329s 00:07:14.977 sys 0m0.260s 00:07:14.977 10:38:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:14.977 10:38:31 -- common/autotest_common.sh@10 -- # set +x 00:07:14.977 ************************************ 00:07:14.977 END TEST accel_dif_generate 00:07:14.977 ************************************ 00:07:14.977 10:38:31 -- accel/accel.sh@105 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:07:14.977 10:38:31 -- common/autotest_common.sh@1077 -- # '[' 6 -le 1 ']' 00:07:14.977 10:38:31 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:14.977 10:38:31 -- common/autotest_common.sh@10 -- # set +x 00:07:14.977 ************************************ 00:07:14.977 START TEST accel_dif_generate_copy 00:07:14.977 ************************************ 00:07:14.977 10:38:31 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w dif_generate_copy 00:07:14.977 10:38:31 -- accel/accel.sh@16 -- # local accel_opc 00:07:15.236 10:38:31 -- accel/accel.sh@17 -- # local accel_module 00:07:15.236 10:38:31 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate_copy 00:07:15.236 10:38:31 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:15.236 10:38:31 -- accel/accel.sh@12 -- # build_accel_config 00:07:15.236 10:38:31 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:15.236 10:38:31 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:15.236 10:38:31 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:15.236 10:38:31 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:15.236 10:38:31 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:15.236 10:38:31 -- accel/accel.sh@41 -- # local IFS=, 00:07:15.236 10:38:31 -- accel/accel.sh@42 -- # jq -r . 00:07:15.236 [2024-07-13 10:38:31.383766] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:07:15.236 [2024-07-13 10:38:31.383857] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1984048 ] 00:07:15.236 EAL: No free 2048 kB hugepages reported on node 1 00:07:15.236 [2024-07-13 10:38:31.453270] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:15.236 [2024-07-13 10:38:31.488539] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.616 10:38:32 -- accel/accel.sh@18 -- # out=' 00:07:16.616 SPDK Configuration: 00:07:16.616 Core mask: 0x1 00:07:16.616 00:07:16.616 Accel Perf Configuration: 00:07:16.616 Workload Type: dif_generate_copy 00:07:16.616 Vector size: 4096 bytes 00:07:16.616 Transfer size: 4096 bytes 00:07:16.616 Vector count 1 00:07:16.616 Module: software 00:07:16.616 Queue depth: 32 00:07:16.616 Allocate depth: 32 00:07:16.616 # threads/core: 1 00:07:16.616 Run time: 1 seconds 00:07:16.616 Verify: No 00:07:16.616 00:07:16.616 Running for 1 seconds... 00:07:16.616 00:07:16.616 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:16.616 ------------------------------------------------------------------------------------ 00:07:16.616 0,0 227360/s 902 MiB/s 0 0 00:07:16.616 ==================================================================================== 00:07:16.616 Total 227360/s 888 MiB/s 0 0' 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # IFS=: 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # read -r var val 00:07:16.616 10:38:32 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:07:16.616 10:38:32 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:16.616 10:38:32 -- accel/accel.sh@12 -- # build_accel_config 00:07:16.616 10:38:32 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:16.616 10:38:32 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:16.616 10:38:32 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:16.616 10:38:32 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:16.616 10:38:32 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:16.616 10:38:32 -- accel/accel.sh@41 -- # local IFS=, 00:07:16.616 10:38:32 -- accel/accel.sh@42 -- # jq -r . 00:07:16.616 [2024-07-13 10:38:32.667147] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:07:16.616 [2024-07-13 10:38:32.667239] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1984322 ] 00:07:16.616 EAL: No free 2048 kB hugepages reported on node 1 00:07:16.616 [2024-07-13 10:38:32.735054] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:16.616 [2024-07-13 10:38:32.768988] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.616 10:38:32 -- accel/accel.sh@21 -- # val= 00:07:16.616 10:38:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # IFS=: 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # read -r var val 00:07:16.616 10:38:32 -- accel/accel.sh@21 -- # val= 00:07:16.616 10:38:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # IFS=: 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # read -r var val 00:07:16.616 10:38:32 -- accel/accel.sh@21 -- # val=0x1 00:07:16.616 10:38:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # IFS=: 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # read -r var val 00:07:16.616 10:38:32 -- accel/accel.sh@21 -- # val= 00:07:16.616 10:38:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # IFS=: 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # read -r var val 00:07:16.616 10:38:32 -- accel/accel.sh@21 -- # val= 00:07:16.616 10:38:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # IFS=: 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # read -r var val 00:07:16.616 10:38:32 -- accel/accel.sh@21 -- # val=dif_generate_copy 00:07:16.616 10:38:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.616 10:38:32 -- accel/accel.sh@24 -- # accel_opc=dif_generate_copy 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # IFS=: 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # read -r var val 00:07:16.616 10:38:32 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:16.616 10:38:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # IFS=: 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # read -r var val 00:07:16.616 10:38:32 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:16.616 10:38:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # IFS=: 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # read -r var val 00:07:16.616 10:38:32 -- accel/accel.sh@21 -- # val= 00:07:16.616 10:38:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # IFS=: 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # read -r var val 00:07:16.616 10:38:32 -- accel/accel.sh@21 -- # val=software 00:07:16.616 10:38:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.616 10:38:32 -- accel/accel.sh@23 -- # accel_module=software 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # IFS=: 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # read -r var val 00:07:16.616 10:38:32 -- accel/accel.sh@21 -- # val=32 00:07:16.616 10:38:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # IFS=: 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # read -r var val 00:07:16.616 10:38:32 -- accel/accel.sh@21 -- # val=32 00:07:16.616 10:38:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # IFS=: 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # read -r 
var val 00:07:16.616 10:38:32 -- accel/accel.sh@21 -- # val=1 00:07:16.616 10:38:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # IFS=: 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # read -r var val 00:07:16.616 10:38:32 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:16.616 10:38:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # IFS=: 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # read -r var val 00:07:16.616 10:38:32 -- accel/accel.sh@21 -- # val=No 00:07:16.616 10:38:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # IFS=: 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # read -r var val 00:07:16.616 10:38:32 -- accel/accel.sh@21 -- # val= 00:07:16.616 10:38:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # IFS=: 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # read -r var val 00:07:16.616 10:38:32 -- accel/accel.sh@21 -- # val= 00:07:16.616 10:38:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # IFS=: 00:07:16.616 10:38:32 -- accel/accel.sh@20 -- # read -r var val 00:07:17.549 10:38:33 -- accel/accel.sh@21 -- # val= 00:07:17.549 10:38:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.549 10:38:33 -- accel/accel.sh@20 -- # IFS=: 00:07:17.550 10:38:33 -- accel/accel.sh@20 -- # read -r var val 00:07:17.550 10:38:33 -- accel/accel.sh@21 -- # val= 00:07:17.550 10:38:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.550 10:38:33 -- accel/accel.sh@20 -- # IFS=: 00:07:17.550 10:38:33 -- accel/accel.sh@20 -- # read -r var val 00:07:17.550 10:38:33 -- accel/accel.sh@21 -- # val= 00:07:17.550 10:38:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.550 10:38:33 -- accel/accel.sh@20 -- # IFS=: 00:07:17.550 10:38:33 -- accel/accel.sh@20 -- # read -r var val 00:07:17.550 10:38:33 -- accel/accel.sh@21 -- # val= 00:07:17.550 10:38:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.550 10:38:33 -- accel/accel.sh@20 -- # IFS=: 00:07:17.550 10:38:33 -- accel/accel.sh@20 -- # read -r var val 00:07:17.550 10:38:33 -- accel/accel.sh@21 -- # val= 00:07:17.550 10:38:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.550 10:38:33 -- accel/accel.sh@20 -- # IFS=: 00:07:17.550 10:38:33 -- accel/accel.sh@20 -- # read -r var val 00:07:17.550 10:38:33 -- accel/accel.sh@21 -- # val= 00:07:17.550 10:38:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.550 10:38:33 -- accel/accel.sh@20 -- # IFS=: 00:07:17.550 10:38:33 -- accel/accel.sh@20 -- # read -r var val 00:07:17.550 10:38:33 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:17.550 10:38:33 -- accel/accel.sh@28 -- # [[ -n dif_generate_copy ]] 00:07:17.550 10:38:33 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:17.550 00:07:17.550 real 0m2.572s 00:07:17.550 user 0m2.318s 00:07:17.550 sys 0m0.262s 00:07:17.550 10:38:33 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:17.550 10:38:33 -- common/autotest_common.sh@10 -- # set +x 00:07:17.550 ************************************ 00:07:17.550 END TEST accel_dif_generate_copy 00:07:17.550 ************************************ 00:07:17.808 10:38:33 -- accel/accel.sh@107 -- # [[ y == y ]] 00:07:17.808 10:38:33 -- accel/accel.sh@108 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:17.808 10:38:33 -- common/autotest_common.sh@1077 -- # '[' 8 -le 1 ']' 00:07:17.808 10:38:33 -- 
common/autotest_common.sh@1083 -- # xtrace_disable 00:07:17.808 10:38:33 -- common/autotest_common.sh@10 -- # set +x 00:07:17.808 ************************************ 00:07:17.808 START TEST accel_comp 00:07:17.808 ************************************ 00:07:17.808 10:38:33 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:17.809 10:38:33 -- accel/accel.sh@16 -- # local accel_opc 00:07:17.809 10:38:33 -- accel/accel.sh@17 -- # local accel_module 00:07:17.809 10:38:33 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:17.809 10:38:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:17.809 10:38:33 -- accel/accel.sh@12 -- # build_accel_config 00:07:17.809 10:38:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:17.809 10:38:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:17.809 10:38:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:17.809 10:38:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:17.809 10:38:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:17.809 10:38:33 -- accel/accel.sh@41 -- # local IFS=, 00:07:17.809 10:38:33 -- accel/accel.sh@42 -- # jq -r . 00:07:17.809 [2024-07-13 10:38:34.003193] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:17.809 [2024-07-13 10:38:34.003283] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1984510 ] 00:07:17.809 EAL: No free 2048 kB hugepages reported on node 1 00:07:17.809 [2024-07-13 10:38:34.071466] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:17.809 [2024-07-13 10:38:34.107057] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.186 10:38:35 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:19.187 00:07:19.187 SPDK Configuration: 00:07:19.187 Core mask: 0x1 00:07:19.187 00:07:19.187 Accel Perf Configuration: 00:07:19.187 Workload Type: compress 00:07:19.187 Transfer size: 4096 bytes 00:07:19.187 Vector count 1 00:07:19.187 Module: software 00:07:19.187 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:19.187 Queue depth: 32 00:07:19.187 Allocate depth: 32 00:07:19.187 # threads/core: 1 00:07:19.187 Run time: 1 seconds 00:07:19.187 Verify: No 00:07:19.187 00:07:19.187 Running for 1 seconds... 
00:07:19.187
00:07:19.187 Core,Thread Transfers Bandwidth Failed Miscompares
00:07:19.187 ------------------------------------------------------------------------------------
00:07:19.187 0,0 67648/s 281 MiB/s 0 0
00:07:19.187 ====================================================================================
00:07:19.187 Total 67648/s 264 MiB/s 0 0'
00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # IFS=:
00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # read -r var val
00:07:19.187 10:38:35 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib
00:07:19.187 10:38:35 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib
00:07:19.187 10:38:35 -- accel/accel.sh@12 -- # build_accel_config
00:07:19.187 10:38:35 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:07:19.187 10:38:35 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:19.187 10:38:35 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:19.187 10:38:35 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:07:19.187 10:38:35 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:07:19.187 10:38:35 -- accel/accel.sh@41 -- # local IFS=,
00:07:19.187 10:38:35 -- accel/accel.sh@42 -- # jq -r .
00:07:19.187 [2024-07-13 10:38:35.290930] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... [2024-07-13 10:38:35.291015] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1984649 ]
00:07:19.187 EAL: No free 2048 kB hugepages reported on node 1
00:07:19.187 [2024-07-13 10:38:35.360147] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:19.187 [2024-07-13 10:38:35.395559] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:19.187 10:38:35 -- accel/accel.sh@21 -- # val=
00:07:19.187 10:38:35 -- accel/accel.sh@22 -- # case "$var" in
00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # IFS=:
00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # read -r var val
00:07:19.187 10:38:35 -- accel/accel.sh@21 -- # val=
00:07:19.187 10:38:35 -- accel/accel.sh@22 -- # case "$var" in
00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # IFS=:
00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # read -r var val
00:07:19.187 10:38:35 -- accel/accel.sh@21 -- # val=
00:07:19.187 10:38:35 -- accel/accel.sh@22 -- # case "$var" in
00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # IFS=:
00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # read -r var val
00:07:19.187 10:38:35 -- accel/accel.sh@21 -- # val=0x1
00:07:19.187 10:38:35 -- accel/accel.sh@22 -- # case "$var" in
00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # IFS=:
00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # read -r var val
00:07:19.187 10:38:35 -- accel/accel.sh@21 -- # val=
00:07:19.187 10:38:35 -- accel/accel.sh@22 -- # case "$var" in
00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # IFS=:
00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # read -r var val
00:07:19.187 10:38:35 -- accel/accel.sh@21 -- # val=
00:07:19.187 10:38:35 -- accel/accel.sh@22 -- # case "$var" in
00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # IFS=:
00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # read -r var val
00:07:19.187 10:38:35 -- accel/accel.sh@21 -- # val=compress
00:07:19.187 10:38:35 -- accel/accel.sh@22 -- # case "$var" in
10:38:35 -- accel/accel.sh@24 -- # accel_opc=compress 00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # IFS=: 00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # read -r var val 00:07:19.187 10:38:35 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:19.187 10:38:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # IFS=: 00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # read -r var val 00:07:19.187 10:38:35 -- accel/accel.sh@21 -- # val= 00:07:19.187 10:38:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # IFS=: 00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # read -r var val 00:07:19.187 10:38:35 -- accel/accel.sh@21 -- # val=software 00:07:19.187 10:38:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.187 10:38:35 -- accel/accel.sh@23 -- # accel_module=software 00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # IFS=: 00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # read -r var val 00:07:19.187 10:38:35 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:19.187 10:38:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # IFS=: 00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # read -r var val 00:07:19.187 10:38:35 -- accel/accel.sh@21 -- # val=32 00:07:19.187 10:38:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # IFS=: 00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # read -r var val 00:07:19.187 10:38:35 -- accel/accel.sh@21 -- # val=32 00:07:19.187 10:38:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # IFS=: 00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # read -r var val 00:07:19.187 10:38:35 -- accel/accel.sh@21 -- # val=1 00:07:19.187 10:38:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # IFS=: 00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # read -r var val 00:07:19.187 10:38:35 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:19.187 10:38:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # IFS=: 00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # read -r var val 00:07:19.187 10:38:35 -- accel/accel.sh@21 -- # val=No 00:07:19.187 10:38:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # IFS=: 00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # read -r var val 00:07:19.187 10:38:35 -- accel/accel.sh@21 -- # val= 00:07:19.187 10:38:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # IFS=: 00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # read -r var val 00:07:19.187 10:38:35 -- accel/accel.sh@21 -- # val= 00:07:19.187 10:38:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # IFS=: 00:07:19.187 10:38:35 -- accel/accel.sh@20 -- # read -r var val 00:07:20.184 10:38:36 -- accel/accel.sh@21 -- # val= 00:07:20.184 10:38:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.184 10:38:36 -- accel/accel.sh@20 -- # IFS=: 00:07:20.184 10:38:36 -- accel/accel.sh@20 -- # read -r var val 00:07:20.184 10:38:36 -- accel/accel.sh@21 -- # val= 00:07:20.184 10:38:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.184 10:38:36 -- accel/accel.sh@20 -- # IFS=: 00:07:20.184 10:38:36 -- accel/accel.sh@20 -- # read -r var val 00:07:20.184 10:38:36 -- accel/accel.sh@21 -- # val= 00:07:20.184 10:38:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.184 10:38:36 -- accel/accel.sh@20 -- # 
IFS=: 00:07:20.184 10:38:36 -- accel/accel.sh@20 -- # read -r var val 00:07:20.184 10:38:36 -- accel/accel.sh@21 -- # val= 00:07:20.184 10:38:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.184 10:38:36 -- accel/accel.sh@20 -- # IFS=: 00:07:20.184 10:38:36 -- accel/accel.sh@20 -- # read -r var val 00:07:20.184 10:38:36 -- accel/accel.sh@21 -- # val= 00:07:20.184 10:38:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.184 10:38:36 -- accel/accel.sh@20 -- # IFS=: 00:07:20.184 10:38:36 -- accel/accel.sh@20 -- # read -r var val 00:07:20.184 10:38:36 -- accel/accel.sh@21 -- # val= 00:07:20.184 10:38:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.184 10:38:36 -- accel/accel.sh@20 -- # IFS=: 00:07:20.184 10:38:36 -- accel/accel.sh@20 -- # read -r var val 00:07:20.184 10:38:36 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:20.184 10:38:36 -- accel/accel.sh@28 -- # [[ -n compress ]] 00:07:20.184 10:38:36 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:20.184 00:07:20.184 real 0m2.580s 00:07:20.184 user 0m2.333s 00:07:20.184 sys 0m0.258s 00:07:20.184 10:38:36 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:20.184 10:38:36 -- common/autotest_common.sh@10 -- # set +x 00:07:20.184 ************************************ 00:07:20.184 END TEST accel_comp 00:07:20.184 ************************************ 00:07:20.444 10:38:36 -- accel/accel.sh@109 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:20.444 10:38:36 -- common/autotest_common.sh@1077 -- # '[' 9 -le 1 ']' 00:07:20.444 10:38:36 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:20.444 10:38:36 -- common/autotest_common.sh@10 -- # set +x 00:07:20.444 ************************************ 00:07:20.444 START TEST accel_decomp 00:07:20.444 ************************************ 00:07:20.444 10:38:36 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:20.444 10:38:36 -- accel/accel.sh@16 -- # local accel_opc 00:07:20.444 10:38:36 -- accel/accel.sh@17 -- # local accel_module 00:07:20.444 10:38:36 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:20.444 10:38:36 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:20.444 10:38:36 -- accel/accel.sh@12 -- # build_accel_config 00:07:20.444 10:38:36 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:20.444 10:38:36 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:20.444 10:38:36 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:20.444 10:38:36 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:20.444 10:38:36 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:20.444 10:38:36 -- accel/accel.sh@41 -- # local IFS=, 00:07:20.444 10:38:36 -- accel/accel.sh@42 -- # jq -r . 00:07:20.444 [2024-07-13 10:38:36.632338] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
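
The trace above shows the harness pattern used for every case in this suite: run_test wraps accel_test, which launches the accel_perf example binary with the workload flags and a JSON accel config fed in on /dev/fd/62. As a rough by-hand rerun sketch -- assuming the SPDK tree from this job is already built at the workspace path shown in the log, and noting that accel_json_cfg=() above is empty, so a plain software run should not need the fd-62 config:

  # Hypothetical manual rerun of the decompress case (not part of this job).
  # Flags as seen in the accel.sh trace: -t run time in seconds, -w workload
  # type, -l input file, -y verify the results.
  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  "$SPDK/build/examples/accel_perf" -t 1 -w decompress -l "$SPDK/test/accel/bib" -y
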
00:07:20.444 [2024-07-13 10:38:36.632436] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1984912 ]
00:07:20.444 EAL: No free 2048 kB hugepages reported on node 1
00:07:20.444 [2024-07-13 10:38:36.700719] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:20.444 [2024-07-13 10:38:36.736188] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:21.825 10:38:37 -- accel/accel.sh@18 -- # out='Preparing input file...
00:07:21.825
00:07:21.825 SPDK Configuration:
00:07:21.825 Core mask: 0x1
00:07:21.825
00:07:21.825 Accel Perf Configuration:
00:07:21.825 Workload Type: decompress
00:07:21.825 Transfer size: 4096 bytes
00:07:21.825 Vector count 1
00:07:21.825 Module: software
00:07:21.825 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib
00:07:21.825 Queue depth: 32
00:07:21.825 Allocate depth: 32
00:07:21.825 # threads/core: 1
00:07:21.825 Run time: 1 seconds
00:07:21.825 Verify: Yes
00:07:21.825
00:07:21.825 Running for 1 seconds...
00:07:21.825
00:07:21.825 Core,Thread Transfers Bandwidth Failed Miscompares
00:07:21.825 ------------------------------------------------------------------------------------
00:07:21.825 0,0 92416/s 170 MiB/s 0 0
00:07:21.825 ====================================================================================
00:07:21.825 Total 92416/s 361 MiB/s 0 0'
00:07:21.825 10:38:37 -- accel/accel.sh@20 -- # IFS=:
00:07:21.825 10:38:37 -- accel/accel.sh@20 -- # read -r var val
00:07:21.825 10:38:37 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y
00:07:21.825 10:38:37 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y
00:07:21.825 10:38:37 -- accel/accel.sh@12 -- # build_accel_config
00:07:21.825 10:38:37 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:07:21.825 10:38:37 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:21.825 10:38:37 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:21.825 10:38:37 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:07:21.825 10:38:37 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:07:21.825 10:38:37 -- accel/accel.sh@41 -- # local IFS=,
00:07:21.825 10:38:37 -- accel/accel.sh@42 -- # jq -r .
00:07:21.825 [2024-07-13 10:38:37.916364] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization...
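
The Total row in the table above is internally consistent: the reported bandwidth is just transfers per second times the transfer size. A quick shell check (taking MiB as 2^20 bytes):

  # 92416 transfers/s * 4096 B = 378,535,936 B/s ~= 361 MiB/s, as logged
  echo $(( 92416 * 4096 / 1024 / 1024 ))   # prints 361
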
00:07:21.825 [2024-07-13 10:38:37.916543] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1985185 ] 00:07:21.825 EAL: No free 2048 kB hugepages reported on node 1 00:07:21.825 [2024-07-13 10:38:37.984516] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:21.825 [2024-07-13 10:38:38.018462] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.825 10:38:38 -- accel/accel.sh@21 -- # val= 00:07:21.825 10:38:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # IFS=: 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # read -r var val 00:07:21.825 10:38:38 -- accel/accel.sh@21 -- # val= 00:07:21.825 10:38:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # IFS=: 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # read -r var val 00:07:21.825 10:38:38 -- accel/accel.sh@21 -- # val= 00:07:21.825 10:38:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # IFS=: 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # read -r var val 00:07:21.825 10:38:38 -- accel/accel.sh@21 -- # val=0x1 00:07:21.825 10:38:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # IFS=: 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # read -r var val 00:07:21.825 10:38:38 -- accel/accel.sh@21 -- # val= 00:07:21.825 10:38:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # IFS=: 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # read -r var val 00:07:21.825 10:38:38 -- accel/accel.sh@21 -- # val= 00:07:21.825 10:38:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # IFS=: 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # read -r var val 00:07:21.825 10:38:38 -- accel/accel.sh@21 -- # val=decompress 00:07:21.825 10:38:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.825 10:38:38 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # IFS=: 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # read -r var val 00:07:21.825 10:38:38 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:21.825 10:38:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # IFS=: 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # read -r var val 00:07:21.825 10:38:38 -- accel/accel.sh@21 -- # val= 00:07:21.825 10:38:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # IFS=: 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # read -r var val 00:07:21.825 10:38:38 -- accel/accel.sh@21 -- # val=software 00:07:21.825 10:38:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.825 10:38:38 -- accel/accel.sh@23 -- # accel_module=software 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # IFS=: 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # read -r var val 00:07:21.825 10:38:38 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:21.825 10:38:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # IFS=: 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # read -r var val 00:07:21.825 10:38:38 -- accel/accel.sh@21 -- # val=32 00:07:21.825 10:38:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # IFS=: 00:07:21.825 
10:38:38 -- accel/accel.sh@20 -- # read -r var val 00:07:21.825 10:38:38 -- accel/accel.sh@21 -- # val=32 00:07:21.825 10:38:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # IFS=: 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # read -r var val 00:07:21.825 10:38:38 -- accel/accel.sh@21 -- # val=1 00:07:21.825 10:38:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # IFS=: 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # read -r var val 00:07:21.825 10:38:38 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:21.825 10:38:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # IFS=: 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # read -r var val 00:07:21.825 10:38:38 -- accel/accel.sh@21 -- # val=Yes 00:07:21.825 10:38:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # IFS=: 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # read -r var val 00:07:21.825 10:38:38 -- accel/accel.sh@21 -- # val= 00:07:21.825 10:38:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # IFS=: 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # read -r var val 00:07:21.825 10:38:38 -- accel/accel.sh@21 -- # val= 00:07:21.825 10:38:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # IFS=: 00:07:21.825 10:38:38 -- accel/accel.sh@20 -- # read -r var val 00:07:23.205 10:38:39 -- accel/accel.sh@21 -- # val= 00:07:23.205 10:38:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.205 10:38:39 -- accel/accel.sh@20 -- # IFS=: 00:07:23.205 10:38:39 -- accel/accel.sh@20 -- # read -r var val 00:07:23.205 10:38:39 -- accel/accel.sh@21 -- # val= 00:07:23.205 10:38:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.205 10:38:39 -- accel/accel.sh@20 -- # IFS=: 00:07:23.205 10:38:39 -- accel/accel.sh@20 -- # read -r var val 00:07:23.205 10:38:39 -- accel/accel.sh@21 -- # val= 00:07:23.205 10:38:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.205 10:38:39 -- accel/accel.sh@20 -- # IFS=: 00:07:23.205 10:38:39 -- accel/accel.sh@20 -- # read -r var val 00:07:23.205 10:38:39 -- accel/accel.sh@21 -- # val= 00:07:23.205 10:38:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.205 10:38:39 -- accel/accel.sh@20 -- # IFS=: 00:07:23.205 10:38:39 -- accel/accel.sh@20 -- # read -r var val 00:07:23.205 10:38:39 -- accel/accel.sh@21 -- # val= 00:07:23.205 10:38:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.205 10:38:39 -- accel/accel.sh@20 -- # IFS=: 00:07:23.205 10:38:39 -- accel/accel.sh@20 -- # read -r var val 00:07:23.205 10:38:39 -- accel/accel.sh@21 -- # val= 00:07:23.205 10:38:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.205 10:38:39 -- accel/accel.sh@20 -- # IFS=: 00:07:23.205 10:38:39 -- accel/accel.sh@20 -- # read -r var val 00:07:23.205 10:38:39 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:23.205 10:38:39 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:23.205 10:38:39 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:23.205 00:07:23.205 real 0m2.573s 00:07:23.205 user 0m2.329s 00:07:23.205 sys 0m0.254s 00:07:23.205 10:38:39 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:23.205 10:38:39 -- common/autotest_common.sh@10 -- # set +x 00:07:23.205 ************************************ 00:07:23.205 END TEST accel_decomp 00:07:23.205 ************************************ 00:07:23.205 10:38:39 -- accel/accel.sh@110 -- # run_test accel_decmop_full accel_test -t 1 
-w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:23.205 10:38:39 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:07:23.205 10:38:39 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:23.205 10:38:39 -- common/autotest_common.sh@10 -- # set +x 00:07:23.205 ************************************ 00:07:23.205 START TEST accel_decmop_full 00:07:23.205 ************************************ 00:07:23.205 10:38:39 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:23.205 10:38:39 -- accel/accel.sh@16 -- # local accel_opc 00:07:23.205 10:38:39 -- accel/accel.sh@17 -- # local accel_module 00:07:23.205 10:38:39 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:23.205 10:38:39 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:23.205 10:38:39 -- accel/accel.sh@12 -- # build_accel_config 00:07:23.205 10:38:39 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:23.205 10:38:39 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:23.205 10:38:39 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:23.205 10:38:39 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:23.205 10:38:39 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:23.205 10:38:39 -- accel/accel.sh@41 -- # local IFS=, 00:07:23.205 10:38:39 -- accel/accel.sh@42 -- # jq -r . 00:07:23.205 [2024-07-13 10:38:39.256249] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:23.205 [2024-07-13 10:38:39.256334] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1985486 ] 00:07:23.205 EAL: No free 2048 kB hugepages reported on node 1 00:07:23.205 [2024-07-13 10:38:39.325014] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:23.205 [2024-07-13 10:38:39.359829] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.584 10:38:40 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:24.584 00:07:24.584 SPDK Configuration: 00:07:24.584 Core mask: 0x1 00:07:24.584 00:07:24.584 Accel Perf Configuration: 00:07:24.584 Workload Type: decompress 00:07:24.584 Transfer size: 111250 bytes 00:07:24.584 Vector count 1 00:07:24.584 Module: software 00:07:24.584 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:24.584 Queue depth: 32 00:07:24.584 Allocate depth: 32 00:07:24.584 # threads/core: 1 00:07:24.584 Run time: 1 seconds 00:07:24.584 Verify: Yes 00:07:24.584 00:07:24.584 Running for 1 seconds... 
00:07:24.584
00:07:24.584 Core,Thread Transfers Bandwidth Failed Miscompares
00:07:24.584 ------------------------------------------------------------------------------------
00:07:24.584 0,0 5792/s 239 MiB/s 0 0
00:07:24.584 ====================================================================================
00:07:24.584 Total 5792/s 614 MiB/s 0 0'
00:07:24.584 10:38:40 -- accel/accel.sh@20 -- # IFS=:
00:07:24.584 10:38:40 -- accel/accel.sh@20 -- # read -r var val
00:07:24.584 10:38:40 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0
00:07:24.584 10:38:40 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0
00:07:24.584 10:38:40 -- accel/accel.sh@12 -- # build_accel_config
00:07:24.584 10:38:40 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:07:24.584 10:38:40 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:24.584 10:38:40 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:24.584 10:38:40 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:07:24.584 10:38:40 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:07:24.584 10:38:40 -- accel/accel.sh@41 -- # local IFS=,
00:07:24.584 10:38:40 -- accel/accel.sh@42 -- # jq -r .
00:07:24.584 [2024-07-13 10:38:40.551717] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... [2024-07-13 10:38:40.551806] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1985752 ]
00:07:24.584 EAL: No free 2048 kB hugepages reported on node 1
00:07:24.584 [2024-07-13 10:38:40.621282] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:24.584 [2024-07-13 10:38:40.655343] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:24.584 10:38:40 -- accel/accel.sh@21 -- # val=
00:07:24.585 10:38:40 -- accel/accel.sh@22 -- # case "$var" in
00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # IFS=:
00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # read -r var val
00:07:24.585 10:38:40 -- accel/accel.sh@21 -- # val=
00:07:24.585 10:38:40 -- accel/accel.sh@22 -- # case "$var" in
00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # IFS=:
00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # read -r var val
00:07:24.585 10:38:40 -- accel/accel.sh@21 -- # val=
00:07:24.585 10:38:40 -- accel/accel.sh@22 -- # case "$var" in
00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # IFS=:
00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # read -r var val
00:07:24.585 10:38:40 -- accel/accel.sh@21 -- # val=0x1
00:07:24.585 10:38:40 -- accel/accel.sh@22 -- # case "$var" in
00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # IFS=:
00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # read -r var val
00:07:24.585 10:38:40 -- accel/accel.sh@21 -- # val=
00:07:24.585 10:38:40 -- accel/accel.sh@22 -- # case "$var" in
00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # IFS=:
00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # read -r var val
00:07:24.585 10:38:40 -- accel/accel.sh@21 -- # val=
00:07:24.585 10:38:40 -- accel/accel.sh@22 -- # case "$var" in
00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # IFS=:
00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # read -r var val
00:07:24.585 10:38:40 -- accel/accel.sh@21 -- # val=decompress
00:07:24.585 10:38:40 -- accel/accel.sh@22 -- # case
"$var" in 00:07:24.585 10:38:40 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # IFS=: 00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # read -r var val 00:07:24.585 10:38:40 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:24.585 10:38:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # IFS=: 00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # read -r var val 00:07:24.585 10:38:40 -- accel/accel.sh@21 -- # val= 00:07:24.585 10:38:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # IFS=: 00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # read -r var val 00:07:24.585 10:38:40 -- accel/accel.sh@21 -- # val=software 00:07:24.585 10:38:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.585 10:38:40 -- accel/accel.sh@23 -- # accel_module=software 00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # IFS=: 00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # read -r var val 00:07:24.585 10:38:40 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:24.585 10:38:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # IFS=: 00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # read -r var val 00:07:24.585 10:38:40 -- accel/accel.sh@21 -- # val=32 00:07:24.585 10:38:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # IFS=: 00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # read -r var val 00:07:24.585 10:38:40 -- accel/accel.sh@21 -- # val=32 00:07:24.585 10:38:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # IFS=: 00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # read -r var val 00:07:24.585 10:38:40 -- accel/accel.sh@21 -- # val=1 00:07:24.585 10:38:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # IFS=: 00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # read -r var val 00:07:24.585 10:38:40 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:24.585 10:38:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # IFS=: 00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # read -r var val 00:07:24.585 10:38:40 -- accel/accel.sh@21 -- # val=Yes 00:07:24.585 10:38:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # IFS=: 00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # read -r var val 00:07:24.585 10:38:40 -- accel/accel.sh@21 -- # val= 00:07:24.585 10:38:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # IFS=: 00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # read -r var val 00:07:24.585 10:38:40 -- accel/accel.sh@21 -- # val= 00:07:24.585 10:38:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # IFS=: 00:07:24.585 10:38:40 -- accel/accel.sh@20 -- # read -r var val 00:07:25.523 10:38:41 -- accel/accel.sh@21 -- # val= 00:07:25.523 10:38:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.523 10:38:41 -- accel/accel.sh@20 -- # IFS=: 00:07:25.523 10:38:41 -- accel/accel.sh@20 -- # read -r var val 00:07:25.523 10:38:41 -- accel/accel.sh@21 -- # val= 00:07:25.523 10:38:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.523 10:38:41 -- accel/accel.sh@20 -- # IFS=: 00:07:25.523 10:38:41 -- accel/accel.sh@20 -- # read -r var val 00:07:25.523 10:38:41 -- accel/accel.sh@21 -- # val= 00:07:25.523 10:38:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.523 10:38:41 
-- accel/accel.sh@20 -- # IFS=: 00:07:25.523 10:38:41 -- accel/accel.sh@20 -- # read -r var val 00:07:25.523 10:38:41 -- accel/accel.sh@21 -- # val= 00:07:25.523 10:38:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.523 10:38:41 -- accel/accel.sh@20 -- # IFS=: 00:07:25.523 10:38:41 -- accel/accel.sh@20 -- # read -r var val 00:07:25.523 10:38:41 -- accel/accel.sh@21 -- # val= 00:07:25.523 10:38:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.523 10:38:41 -- accel/accel.sh@20 -- # IFS=: 00:07:25.523 10:38:41 -- accel/accel.sh@20 -- # read -r var val 00:07:25.523 10:38:41 -- accel/accel.sh@21 -- # val= 00:07:25.523 10:38:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.523 10:38:41 -- accel/accel.sh@20 -- # IFS=: 00:07:25.523 10:38:41 -- accel/accel.sh@20 -- # read -r var val 00:07:25.523 10:38:41 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:25.523 10:38:41 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:25.523 10:38:41 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:25.523 00:07:25.523 real 0m2.596s 00:07:25.523 user 0m2.345s 00:07:25.523 sys 0m0.259s 00:07:25.523 10:38:41 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:25.523 10:38:41 -- common/autotest_common.sh@10 -- # set +x 00:07:25.523 ************************************ 00:07:25.523 END TEST accel_decmop_full 00:07:25.523 ************************************ 00:07:25.523 10:38:41 -- accel/accel.sh@111 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:25.523 10:38:41 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:07:25.523 10:38:41 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:25.523 10:38:41 -- common/autotest_common.sh@10 -- # set +x 00:07:25.523 ************************************ 00:07:25.523 START TEST accel_decomp_mcore 00:07:25.523 ************************************ 00:07:25.523 10:38:41 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:25.523 10:38:41 -- accel/accel.sh@16 -- # local accel_opc 00:07:25.523 10:38:41 -- accel/accel.sh@17 -- # local accel_module 00:07:25.523 10:38:41 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:25.523 10:38:41 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:25.523 10:38:41 -- accel/accel.sh@12 -- # build_accel_config 00:07:25.523 10:38:41 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:25.523 10:38:41 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:25.523 10:38:41 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:25.523 10:38:41 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:25.523 10:38:41 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:25.523 10:38:41 -- accel/accel.sh@41 -- # local IFS=, 00:07:25.523 10:38:41 -- accel/accel.sh@42 -- # jq -r . 00:07:25.523 [2024-07-13 10:38:41.899482] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
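
Where the earlier runs pinned a single core (core mask 0x1, one reactor on core 0), accel_decomp_mcore passes -m 0xf, so the EAL brings up one reactor per set mask bit -- hence the four "Reactor started" notices and the four per-core result rows that follow. A small popcount sketch over the mask:

  # 0xf = 0b1111 -> cores 0-3 enabled, one reactor (and one table row) each
  mask=0xf; n=0
  for (( i = 0; i < 32; i++ )); do (( n += (mask >> i) & 1 )); done
  echo "$n cores"   # prints: 4 cores
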
00:07:25.523 [2024-07-13 10:38:41.899572] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1986035 ]
00:07:25.782 EAL: No free 2048 kB hugepages reported on node 1
00:07:25.782 [2024-07-13 10:38:41.968479] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4
00:07:25.782 [2024-07-13 10:38:42.006134] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:07:25.782 [2024-07-13 10:38:42.006229] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2
00:07:25.782 [2024-07-13 10:38:42.006317] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3
00:07:25.782 [2024-07-13 10:38:42.006319] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:27.157 10:38:43 -- accel/accel.sh@18 -- # out='Preparing input file...
00:07:27.157
00:07:27.157 SPDK Configuration:
00:07:27.157 Core mask: 0xf
00:07:27.157
00:07:27.157 Accel Perf Configuration:
00:07:27.157 Workload Type: decompress
00:07:27.157 Transfer size: 4096 bytes
00:07:27.157 Vector count 1
00:07:27.157 Module: software
00:07:27.157 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib
00:07:27.157 Queue depth: 32
00:07:27.157 Allocate depth: 32
00:07:27.157 # threads/core: 1
00:07:27.157 Run time: 1 seconds
00:07:27.157 Verify: Yes
00:07:27.157
00:07:27.157 Running for 1 seconds...
00:07:27.157
00:07:27.157 Core,Thread Transfers Bandwidth Failed Miscompares
00:07:27.157 ------------------------------------------------------------------------------------
00:07:27.157 0,0 78624/s 144 MiB/s 0 0
00:07:27.157 3,0 78976/s 145 MiB/s 0 0
00:07:27.157 2,0 78784/s 145 MiB/s 0 0
00:07:27.157 1,0 78944/s 145 MiB/s 0 0
00:07:27.157 ====================================================================================
00:07:27.157 Total 315328/s 1231 MiB/s 0 0'
00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # IFS=:
00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # read -r var val
00:07:27.158 10:38:43 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:07:27.158 10:38:43 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:07:27.158 10:38:43 -- accel/accel.sh@12 -- # build_accel_config
00:07:27.158 10:38:43 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:07:27.158 10:38:43 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:27.158 10:38:43 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:27.158 10:38:43 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:07:27.158 10:38:43 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:07:27.158 10:38:43 -- accel/accel.sh@41 -- # local IFS=,
00:07:27.158 10:38:43 -- accel/accel.sh@42 -- # jq -r .
00:07:27.157 [2024-07-13 10:38:43.193760] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization...
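
In the multicore table above, the Total row is the sum of the four per-core rows, and 315328/s at 4096 bytes per transfer again works out to the reported 1231 MiB/s:

  # 78624 + 78976 + 78784 + 78944 = 315328 transfers/s, matching the Total row
  echo $(( 78624 + 78976 + 78784 + 78944 ))
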
00:07:27.158 [2024-07-13 10:38:43.193849] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1986183 ] 00:07:27.158 EAL: No free 2048 kB hugepages reported on node 1 00:07:27.158 [2024-07-13 10:38:43.263008] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:27.158 [2024-07-13 10:38:43.300059] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:27.158 [2024-07-13 10:38:43.300154] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:27.158 [2024-07-13 10:38:43.300242] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:27.158 [2024-07-13 10:38:43.300244] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.158 10:38:43 -- accel/accel.sh@21 -- # val= 00:07:27.158 10:38:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # IFS=: 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # read -r var val 00:07:27.158 10:38:43 -- accel/accel.sh@21 -- # val= 00:07:27.158 10:38:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # IFS=: 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # read -r var val 00:07:27.158 10:38:43 -- accel/accel.sh@21 -- # val= 00:07:27.158 10:38:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # IFS=: 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # read -r var val 00:07:27.158 10:38:43 -- accel/accel.sh@21 -- # val=0xf 00:07:27.158 10:38:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # IFS=: 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # read -r var val 00:07:27.158 10:38:43 -- accel/accel.sh@21 -- # val= 00:07:27.158 10:38:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # IFS=: 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # read -r var val 00:07:27.158 10:38:43 -- accel/accel.sh@21 -- # val= 00:07:27.158 10:38:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # IFS=: 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # read -r var val 00:07:27.158 10:38:43 -- accel/accel.sh@21 -- # val=decompress 00:07:27.158 10:38:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.158 10:38:43 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # IFS=: 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # read -r var val 00:07:27.158 10:38:43 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:27.158 10:38:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # IFS=: 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # read -r var val 00:07:27.158 10:38:43 -- accel/accel.sh@21 -- # val= 00:07:27.158 10:38:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # IFS=: 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # read -r var val 00:07:27.158 10:38:43 -- accel/accel.sh@21 -- # val=software 00:07:27.158 10:38:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.158 10:38:43 -- accel/accel.sh@23 -- # accel_module=software 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # IFS=: 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # read -r var val 00:07:27.158 10:38:43 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:27.158 10:38:43 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # IFS=: 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # read -r var val 00:07:27.158 10:38:43 -- accel/accel.sh@21 -- # val=32 00:07:27.158 10:38:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # IFS=: 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # read -r var val 00:07:27.158 10:38:43 -- accel/accel.sh@21 -- # val=32 00:07:27.158 10:38:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # IFS=: 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # read -r var val 00:07:27.158 10:38:43 -- accel/accel.sh@21 -- # val=1 00:07:27.158 10:38:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # IFS=: 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # read -r var val 00:07:27.158 10:38:43 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:27.158 10:38:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # IFS=: 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # read -r var val 00:07:27.158 10:38:43 -- accel/accel.sh@21 -- # val=Yes 00:07:27.158 10:38:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # IFS=: 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # read -r var val 00:07:27.158 10:38:43 -- accel/accel.sh@21 -- # val= 00:07:27.158 10:38:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # IFS=: 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # read -r var val 00:07:27.158 10:38:43 -- accel/accel.sh@21 -- # val= 00:07:27.158 10:38:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # IFS=: 00:07:27.158 10:38:43 -- accel/accel.sh@20 -- # read -r var val 00:07:28.121 10:38:44 -- accel/accel.sh@21 -- # val= 00:07:28.121 10:38:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.121 10:38:44 -- accel/accel.sh@20 -- # IFS=: 00:07:28.121 10:38:44 -- accel/accel.sh@20 -- # read -r var val 00:07:28.121 10:38:44 -- accel/accel.sh@21 -- # val= 00:07:28.121 10:38:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.121 10:38:44 -- accel/accel.sh@20 -- # IFS=: 00:07:28.121 10:38:44 -- accel/accel.sh@20 -- # read -r var val 00:07:28.121 10:38:44 -- accel/accel.sh@21 -- # val= 00:07:28.121 10:38:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.121 10:38:44 -- accel/accel.sh@20 -- # IFS=: 00:07:28.121 10:38:44 -- accel/accel.sh@20 -- # read -r var val 00:07:28.121 10:38:44 -- accel/accel.sh@21 -- # val= 00:07:28.121 10:38:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.121 10:38:44 -- accel/accel.sh@20 -- # IFS=: 00:07:28.121 10:38:44 -- accel/accel.sh@20 -- # read -r var val 00:07:28.121 10:38:44 -- accel/accel.sh@21 -- # val= 00:07:28.121 10:38:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.121 10:38:44 -- accel/accel.sh@20 -- # IFS=: 00:07:28.121 10:38:44 -- accel/accel.sh@20 -- # read -r var val 00:07:28.121 10:38:44 -- accel/accel.sh@21 -- # val= 00:07:28.121 10:38:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.121 10:38:44 -- accel/accel.sh@20 -- # IFS=: 00:07:28.121 10:38:44 -- accel/accel.sh@20 -- # read -r var val 00:07:28.121 10:38:44 -- accel/accel.sh@21 -- # val= 00:07:28.121 10:38:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.121 10:38:44 -- accel/accel.sh@20 -- # IFS=: 00:07:28.121 10:38:44 -- accel/accel.sh@20 -- # read -r var val 00:07:28.121 10:38:44 -- accel/accel.sh@21 -- # val= 00:07:28.121 10:38:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.121 
10:38:44 -- accel/accel.sh@20 -- # IFS=: 00:07:28.121 10:38:44 -- accel/accel.sh@20 -- # read -r var val 00:07:28.121 10:38:44 -- accel/accel.sh@21 -- # val= 00:07:28.121 10:38:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.121 10:38:44 -- accel/accel.sh@20 -- # IFS=: 00:07:28.121 10:38:44 -- accel/accel.sh@20 -- # read -r var val 00:07:28.121 10:38:44 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:28.121 10:38:44 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:28.121 10:38:44 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:28.121 00:07:28.121 real 0m2.599s 00:07:28.121 user 0m8.982s 00:07:28.121 sys 0m0.280s 00:07:28.121 10:38:44 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:28.121 10:38:44 -- common/autotest_common.sh@10 -- # set +x 00:07:28.121 ************************************ 00:07:28.121 END TEST accel_decomp_mcore 00:07:28.121 ************************************ 00:07:28.379 10:38:44 -- accel/accel.sh@112 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:28.379 10:38:44 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:07:28.379 10:38:44 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:28.379 10:38:44 -- common/autotest_common.sh@10 -- # set +x 00:07:28.379 ************************************ 00:07:28.379 START TEST accel_decomp_full_mcore 00:07:28.379 ************************************ 00:07:28.379 10:38:44 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:28.379 10:38:44 -- accel/accel.sh@16 -- # local accel_opc 00:07:28.379 10:38:44 -- accel/accel.sh@17 -- # local accel_module 00:07:28.379 10:38:44 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:28.379 10:38:44 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:28.379 10:38:44 -- accel/accel.sh@12 -- # build_accel_config 00:07:28.379 10:38:44 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:28.379 10:38:44 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:28.379 10:38:44 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:28.379 10:38:44 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:28.379 10:38:44 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:28.379 10:38:44 -- accel/accel.sh@41 -- # local IFS=, 00:07:28.379 10:38:44 -- accel/accel.sh@42 -- # jq -r . 00:07:28.379 [2024-07-13 10:38:44.547455] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:07:28.379 [2024-07-13 10:38:44.547546] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1986371 ]
00:07:28.379 EAL: No free 2048 kB hugepages reported on node 1
00:07:28.379 [2024-07-13 10:38:44.617818] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4
00:07:28.379 [2024-07-13 10:38:44.655744] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:07:28.379 [2024-07-13 10:38:44.655839] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2
00:07:28.379 [2024-07-13 10:38:44.655927] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3
00:07:28.379 [2024-07-13 10:38:44.655929] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:29.755 10:38:45 -- accel/accel.sh@18 -- # out='Preparing input file...
00:07:29.755
00:07:29.755 SPDK Configuration:
00:07:29.755 Core mask: 0xf
00:07:29.755
00:07:29.755 Accel Perf Configuration:
00:07:29.755 Workload Type: decompress
00:07:29.755 Transfer size: 111250 bytes
00:07:29.755 Vector count 1
00:07:29.755 Module: software
00:07:29.755 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib
00:07:29.755 Queue depth: 32
00:07:29.755 Allocate depth: 32
00:07:29.755 # threads/core: 1
00:07:29.755 Run time: 1 seconds
00:07:29.755 Verify: Yes
00:07:29.755
00:07:29.755 Running for 1 seconds...
00:07:29.755
00:07:29.755 Core,Thread Transfers Bandwidth Failed Miscompares
00:07:29.755 ------------------------------------------------------------------------------------
00:07:29.755 0,0 5792/s 239 MiB/s 0 0
00:07:29.755 3,0 5824/s 240 MiB/s 0 0
00:07:29.755 2,0 5824/s 240 MiB/s 0 0
00:07:29.755 1,0 5824/s 240 MiB/s 0 0
00:07:29.755 ====================================================================================
00:07:29.755 Total 23264/s 2468 MiB/s 0 0'
00:07:29.755 10:38:45 -- accel/accel.sh@20 -- # IFS=:
00:07:29.755 10:38:45 -- accel/accel.sh@20 -- # read -r var val
00:07:29.755 10:38:45 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
00:07:29.755 10:38:45 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
00:07:29.755 10:38:45 -- accel/accel.sh@12 -- # build_accel_config
00:07:29.755 10:38:45 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:07:29.755 10:38:45 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:29.755 10:38:45 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:29.755 10:38:45 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:07:29.755 10:38:45 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:07:29.755 10:38:45 -- accel/accel.sh@41 -- # local IFS=,
00:07:29.755 10:38:45 -- accel/accel.sh@42 -- # jq -r .
00:07:29.755 [2024-07-13 10:38:45.852476] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization...
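
accel_decomp_full_mcore adds -o 0 to the same multicore decompress run; the reported transfer size then becomes 111250 bytes instead of the 4096-byte default (presumably derived from the bib input file rather than fixed on the command line -- an inference from this log, not a documented claim). The totals still check out at the larger size:

  # (5792 + 5824*3) = 23264 transfers/s; 23264 * 111250 B ~= 2468 MiB/s
  echo $(( (5792 + 5824 * 3) * 111250 / 1024 / 1024 ))   # prints 2468
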
00:07:29.755 [2024-07-13 10:38:45.852568] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1986624 ] 00:07:29.755 EAL: No free 2048 kB hugepages reported on node 1 00:07:29.755 [2024-07-13 10:38:45.924488] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:29.755 [2024-07-13 10:38:45.961428] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:29.755 [2024-07-13 10:38:45.961526] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:29.755 [2024-07-13 10:38:45.961545] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:29.755 [2024-07-13 10:38:45.961547] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.756 10:38:46 -- accel/accel.sh@21 -- # val= 00:07:29.756 10:38:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # IFS=: 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # read -r var val 00:07:29.756 10:38:46 -- accel/accel.sh@21 -- # val= 00:07:29.756 10:38:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # IFS=: 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # read -r var val 00:07:29.756 10:38:46 -- accel/accel.sh@21 -- # val= 00:07:29.756 10:38:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # IFS=: 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # read -r var val 00:07:29.756 10:38:46 -- accel/accel.sh@21 -- # val=0xf 00:07:29.756 10:38:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # IFS=: 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # read -r var val 00:07:29.756 10:38:46 -- accel/accel.sh@21 -- # val= 00:07:29.756 10:38:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # IFS=: 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # read -r var val 00:07:29.756 10:38:46 -- accel/accel.sh@21 -- # val= 00:07:29.756 10:38:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # IFS=: 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # read -r var val 00:07:29.756 10:38:46 -- accel/accel.sh@21 -- # val=decompress 00:07:29.756 10:38:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.756 10:38:46 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # IFS=: 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # read -r var val 00:07:29.756 10:38:46 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:29.756 10:38:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # IFS=: 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # read -r var val 00:07:29.756 10:38:46 -- accel/accel.sh@21 -- # val= 00:07:29.756 10:38:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # IFS=: 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # read -r var val 00:07:29.756 10:38:46 -- accel/accel.sh@21 -- # val=software 00:07:29.756 10:38:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.756 10:38:46 -- accel/accel.sh@23 -- # accel_module=software 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # IFS=: 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # read -r var val 00:07:29.756 10:38:46 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:29.756 10:38:46 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # IFS=: 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # read -r var val 00:07:29.756 10:38:46 -- accel/accel.sh@21 -- # val=32 00:07:29.756 10:38:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # IFS=: 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # read -r var val 00:07:29.756 10:38:46 -- accel/accel.sh@21 -- # val=32 00:07:29.756 10:38:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # IFS=: 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # read -r var val 00:07:29.756 10:38:46 -- accel/accel.sh@21 -- # val=1 00:07:29.756 10:38:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # IFS=: 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # read -r var val 00:07:29.756 10:38:46 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:29.756 10:38:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # IFS=: 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # read -r var val 00:07:29.756 10:38:46 -- accel/accel.sh@21 -- # val=Yes 00:07:29.756 10:38:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # IFS=: 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # read -r var val 00:07:29.756 10:38:46 -- accel/accel.sh@21 -- # val= 00:07:29.756 10:38:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # IFS=: 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # read -r var val 00:07:29.756 10:38:46 -- accel/accel.sh@21 -- # val= 00:07:29.756 10:38:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # IFS=: 00:07:29.756 10:38:46 -- accel/accel.sh@20 -- # read -r var val 00:07:31.135 10:38:47 -- accel/accel.sh@21 -- # val= 00:07:31.135 10:38:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.135 10:38:47 -- accel/accel.sh@20 -- # IFS=: 00:07:31.135 10:38:47 -- accel/accel.sh@20 -- # read -r var val 00:07:31.135 10:38:47 -- accel/accel.sh@21 -- # val= 00:07:31.135 10:38:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.135 10:38:47 -- accel/accel.sh@20 -- # IFS=: 00:07:31.135 10:38:47 -- accel/accel.sh@20 -- # read -r var val 00:07:31.135 10:38:47 -- accel/accel.sh@21 -- # val= 00:07:31.135 10:38:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.135 10:38:47 -- accel/accel.sh@20 -- # IFS=: 00:07:31.135 10:38:47 -- accel/accel.sh@20 -- # read -r var val 00:07:31.135 10:38:47 -- accel/accel.sh@21 -- # val= 00:07:31.135 10:38:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.135 10:38:47 -- accel/accel.sh@20 -- # IFS=: 00:07:31.135 10:38:47 -- accel/accel.sh@20 -- # read -r var val 00:07:31.135 10:38:47 -- accel/accel.sh@21 -- # val= 00:07:31.135 10:38:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.135 10:38:47 -- accel/accel.sh@20 -- # IFS=: 00:07:31.135 10:38:47 -- accel/accel.sh@20 -- # read -r var val 00:07:31.135 10:38:47 -- accel/accel.sh@21 -- # val= 00:07:31.135 10:38:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.135 10:38:47 -- accel/accel.sh@20 -- # IFS=: 00:07:31.135 10:38:47 -- accel/accel.sh@20 -- # read -r var val 00:07:31.135 10:38:47 -- accel/accel.sh@21 -- # val= 00:07:31.135 10:38:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.135 10:38:47 -- accel/accel.sh@20 -- # IFS=: 00:07:31.135 10:38:47 -- accel/accel.sh@20 -- # read -r var val 00:07:31.135 10:38:47 -- accel/accel.sh@21 -- # val= 00:07:31.135 10:38:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.135 
10:38:47 -- accel/accel.sh@20 -- # IFS=: 00:07:31.135 10:38:47 -- accel/accel.sh@20 -- # read -r var val 00:07:31.135 10:38:47 -- accel/accel.sh@21 -- # val= 00:07:31.135 10:38:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:31.135 10:38:47 -- accel/accel.sh@20 -- # IFS=: 00:07:31.135 10:38:47 -- accel/accel.sh@20 -- # read -r var val 00:07:31.135 10:38:47 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:31.135 10:38:47 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:31.135 10:38:47 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:31.135 00:07:31.135 real 0m2.623s 00:07:31.135 user 0m9.057s 00:07:31.135 sys 0m0.267s 00:07:31.135 10:38:47 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:31.135 10:38:47 -- common/autotest_common.sh@10 -- # set +x 00:07:31.135 ************************************ 00:07:31.135 END TEST accel_decomp_full_mcore 00:07:31.135 ************************************ 00:07:31.135 10:38:47 -- accel/accel.sh@113 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:31.135 10:38:47 -- common/autotest_common.sh@1077 -- # '[' 11 -le 1 ']' 00:07:31.135 10:38:47 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:31.135 10:38:47 -- common/autotest_common.sh@10 -- # set +x 00:07:31.135 ************************************ 00:07:31.135 START TEST accel_decomp_mthread 00:07:31.135 ************************************ 00:07:31.135 10:38:47 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:31.135 10:38:47 -- accel/accel.sh@16 -- # local accel_opc 00:07:31.135 10:38:47 -- accel/accel.sh@17 -- # local accel_module 00:07:31.135 10:38:47 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:31.135 10:38:47 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:31.135 10:38:47 -- accel/accel.sh@12 -- # build_accel_config 00:07:31.135 10:38:47 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:31.135 10:38:47 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:31.135 10:38:47 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:31.135 10:38:47 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:31.135 10:38:47 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:31.135 10:38:47 -- accel/accel.sh@41 -- # local IFS=, 00:07:31.135 10:38:47 -- accel/accel.sh@42 -- # jq -r . 00:07:31.135 [2024-07-13 10:38:47.218505] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:31.135 [2024-07-13 10:38:47.218596] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1986914 ] 00:07:31.135 EAL: No free 2048 kB hugepages reported on node 1 00:07:31.135 [2024-07-13 10:38:47.289813] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:31.135 [2024-07-13 10:38:47.325246] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:32.513 10:38:48 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:07:32.513 00:07:32.513 SPDK Configuration: 00:07:32.513 Core mask: 0x1 00:07:32.513 00:07:32.513 Accel Perf Configuration: 00:07:32.513 Workload Type: decompress 00:07:32.513 Transfer size: 4096 bytes 00:07:32.513 Vector count 1 00:07:32.513 Module: software 00:07:32.513 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:32.513 Queue depth: 32 00:07:32.513 Allocate depth: 32 00:07:32.513 # threads/core: 2 00:07:32.513 Run time: 1 seconds 00:07:32.513 Verify: Yes 00:07:32.513 00:07:32.513 Running for 1 seconds... 00:07:32.513 00:07:32.513 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:32.513 ------------------------------------------------------------------------------------ 00:07:32.513 0,1 47008/s 86 MiB/s 0 0 00:07:32.513 0,0 46848/s 86 MiB/s 0 0 00:07:32.513 ==================================================================================== 00:07:32.513 Total 93856/s 366 MiB/s 0 0' 00:07:32.513 10:38:48 -- accel/accel.sh@20 -- # IFS=: 00:07:32.513 10:38:48 -- accel/accel.sh@20 -- # read -r var val 00:07:32.513 10:38:48 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:32.513 10:38:48 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:32.513 10:38:48 -- accel/accel.sh@12 -- # build_accel_config 00:07:32.513 10:38:48 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:32.513 10:38:48 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:32.513 10:38:48 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:32.513 10:38:48 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:32.513 10:38:48 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:32.513 10:38:48 -- accel/accel.sh@41 -- # local IFS=, 00:07:32.513 10:38:48 -- accel/accel.sh@42 -- # jq -r . 00:07:32.513 [2024-07-13 10:38:48.510177] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:07:32.513 [2024-07-13 10:38:48.510284] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1987182 ] 00:07:32.513 EAL: No free 2048 kB hugepages reported on node 1 00:07:32.513 [2024-07-13 10:38:48.577844] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:32.513 [2024-07-13 10:38:48.612021] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:32.513 10:38:48 -- accel/accel.sh@21 -- # val= 00:07:32.513 10:38:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.513 10:38:48 -- accel/accel.sh@20 -- # IFS=: 00:07:32.513 10:38:48 -- accel/accel.sh@20 -- # read -r var val 00:07:32.513 10:38:48 -- accel/accel.sh@21 -- # val= 00:07:32.513 10:38:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.513 10:38:48 -- accel/accel.sh@20 -- # IFS=: 00:07:32.513 10:38:48 -- accel/accel.sh@20 -- # read -r var val 00:07:32.513 10:38:48 -- accel/accel.sh@21 -- # val= 00:07:32.513 10:38:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.513 10:38:48 -- accel/accel.sh@20 -- # IFS=: 00:07:32.513 10:38:48 -- accel/accel.sh@20 -- # read -r var val 00:07:32.513 10:38:48 -- accel/accel.sh@21 -- # val=0x1 00:07:32.513 10:38:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.513 10:38:48 -- accel/accel.sh@20 -- # IFS=: 00:07:32.513 10:38:48 -- accel/accel.sh@20 -- # read -r var val 00:07:32.513 10:38:48 -- accel/accel.sh@21 -- # val= 00:07:32.513 10:38:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.513 10:38:48 -- accel/accel.sh@20 -- # IFS=: 00:07:32.513 10:38:48 -- accel/accel.sh@20 -- # read -r var val 00:07:32.513 10:38:48 -- accel/accel.sh@21 -- # val= 00:07:32.513 10:38:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.513 10:38:48 -- accel/accel.sh@20 -- # IFS=: 00:07:32.513 10:38:48 -- accel/accel.sh@20 -- # read -r var val 00:07:32.513 10:38:48 -- accel/accel.sh@21 -- # val=decompress 00:07:32.513 10:38:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.513 10:38:48 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:32.513 10:38:48 -- accel/accel.sh@20 -- # IFS=: 00:07:32.513 10:38:48 -- accel/accel.sh@20 -- # read -r var val 00:07:32.513 10:38:48 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:32.513 10:38:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.513 10:38:48 -- accel/accel.sh@20 -- # IFS=: 00:07:32.513 10:38:48 -- accel/accel.sh@20 -- # read -r var val 00:07:32.513 10:38:48 -- accel/accel.sh@21 -- # val= 00:07:32.513 10:38:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.513 10:38:48 -- accel/accel.sh@20 -- # IFS=: 00:07:32.513 10:38:48 -- accel/accel.sh@20 -- # read -r var val 00:07:32.513 10:38:48 -- accel/accel.sh@21 -- # val=software 00:07:32.513 10:38:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.513 10:38:48 -- accel/accel.sh@23 -- # accel_module=software 00:07:32.513 10:38:48 -- accel/accel.sh@20 -- # IFS=: 00:07:32.513 10:38:48 -- accel/accel.sh@20 -- # read -r var val 00:07:32.513 10:38:48 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:32.513 10:38:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.513 10:38:48 -- accel/accel.sh@20 -- # IFS=: 00:07:32.513 10:38:48 -- accel/accel.sh@20 -- # read -r var val 00:07:32.513 10:38:48 -- accel/accel.sh@21 -- # val=32 00:07:32.513 10:38:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.513 10:38:48 -- accel/accel.sh@20 -- # IFS=: 00:07:32.513 
10:38:48 -- accel/accel.sh@20 -- # read -r var val 00:07:32.513 10:38:48 -- accel/accel.sh@21 -- # val=32 00:07:32.514 10:38:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.514 10:38:48 -- accel/accel.sh@20 -- # IFS=: 00:07:32.514 10:38:48 -- accel/accel.sh@20 -- # read -r var val 00:07:32.514 10:38:48 -- accel/accel.sh@21 -- # val=2 00:07:32.514 10:38:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.514 10:38:48 -- accel/accel.sh@20 -- # IFS=: 00:07:32.514 10:38:48 -- accel/accel.sh@20 -- # read -r var val 00:07:32.514 10:38:48 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:32.514 10:38:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.514 10:38:48 -- accel/accel.sh@20 -- # IFS=: 00:07:32.514 10:38:48 -- accel/accel.sh@20 -- # read -r var val 00:07:32.514 10:38:48 -- accel/accel.sh@21 -- # val=Yes 00:07:32.514 10:38:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.514 10:38:48 -- accel/accel.sh@20 -- # IFS=: 00:07:32.514 10:38:48 -- accel/accel.sh@20 -- # read -r var val 00:07:32.514 10:38:48 -- accel/accel.sh@21 -- # val= 00:07:32.514 10:38:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.514 10:38:48 -- accel/accel.sh@20 -- # IFS=: 00:07:32.514 10:38:48 -- accel/accel.sh@20 -- # read -r var val 00:07:32.514 10:38:48 -- accel/accel.sh@21 -- # val= 00:07:32.514 10:38:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:32.514 10:38:48 -- accel/accel.sh@20 -- # IFS=: 00:07:32.514 10:38:48 -- accel/accel.sh@20 -- # read -r var val 00:07:33.450 10:38:49 -- accel/accel.sh@21 -- # val= 00:07:33.450 10:38:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.450 10:38:49 -- accel/accel.sh@20 -- # IFS=: 00:07:33.450 10:38:49 -- accel/accel.sh@20 -- # read -r var val 00:07:33.450 10:38:49 -- accel/accel.sh@21 -- # val= 00:07:33.450 10:38:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.450 10:38:49 -- accel/accel.sh@20 -- # IFS=: 00:07:33.450 10:38:49 -- accel/accel.sh@20 -- # read -r var val 00:07:33.450 10:38:49 -- accel/accel.sh@21 -- # val= 00:07:33.450 10:38:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.450 10:38:49 -- accel/accel.sh@20 -- # IFS=: 00:07:33.450 10:38:49 -- accel/accel.sh@20 -- # read -r var val 00:07:33.450 10:38:49 -- accel/accel.sh@21 -- # val= 00:07:33.450 10:38:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.450 10:38:49 -- accel/accel.sh@20 -- # IFS=: 00:07:33.450 10:38:49 -- accel/accel.sh@20 -- # read -r var val 00:07:33.450 10:38:49 -- accel/accel.sh@21 -- # val= 00:07:33.450 10:38:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.450 10:38:49 -- accel/accel.sh@20 -- # IFS=: 00:07:33.450 10:38:49 -- accel/accel.sh@20 -- # read -r var val 00:07:33.450 10:38:49 -- accel/accel.sh@21 -- # val= 00:07:33.450 10:38:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.450 10:38:49 -- accel/accel.sh@20 -- # IFS=: 00:07:33.450 10:38:49 -- accel/accel.sh@20 -- # read -r var val 00:07:33.450 10:38:49 -- accel/accel.sh@21 -- # val= 00:07:33.450 10:38:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:33.450 10:38:49 -- accel/accel.sh@20 -- # IFS=: 00:07:33.450 10:38:49 -- accel/accel.sh@20 -- # read -r var val 00:07:33.450 10:38:49 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:33.450 10:38:49 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:33.450 10:38:49 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:33.450 00:07:33.450 real 0m2.586s 00:07:33.450 user 0m2.336s 00:07:33.450 sys 0m0.261s 00:07:33.450 10:38:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:33.450 10:38:49 -- common/autotest_common.sh@10 -- # 
set +x 00:07:33.450 ************************************ 00:07:33.450 END TEST accel_decomp_mthread 00:07:33.450 ************************************ 00:07:33.450 10:38:49 -- accel/accel.sh@114 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:33.450 10:38:49 -- common/autotest_common.sh@1077 -- # '[' 13 -le 1 ']' 00:07:33.450 10:38:49 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:33.450 10:38:49 -- common/autotest_common.sh@10 -- # set +x 00:07:33.450 ************************************ 00:07:33.450 START TEST accel_deomp_full_mthread 00:07:33.450 ************************************ 00:07:33.450 10:38:49 -- common/autotest_common.sh@1104 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:33.450 10:38:49 -- accel/accel.sh@16 -- # local accel_opc 00:07:33.450 10:38:49 -- accel/accel.sh@17 -- # local accel_module 00:07:33.450 10:38:49 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:33.450 10:38:49 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:33.450 10:38:49 -- accel/accel.sh@12 -- # build_accel_config 00:07:33.450 10:38:49 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:33.450 10:38:49 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:33.450 10:38:49 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:33.450 10:38:49 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:33.710 10:38:49 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:33.710 10:38:49 -- accel/accel.sh@41 -- # local IFS=, 00:07:33.710 10:38:49 -- accel/accel.sh@42 -- # jq -r . 00:07:33.710 [2024-07-13 10:38:49.851702] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:33.710 [2024-07-13 10:38:49.851792] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1987463 ] 00:07:33.710 EAL: No free 2048 kB hugepages reported on node 1 00:07:33.710 [2024-07-13 10:38:49.920582] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:33.710 [2024-07-13 10:38:49.957601] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.087 10:38:51 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:35.087 00:07:35.087 SPDK Configuration: 00:07:35.087 Core mask: 0x1 00:07:35.087 00:07:35.087 Accel Perf Configuration: 00:07:35.087 Workload Type: decompress 00:07:35.087 Transfer size: 111250 bytes 00:07:35.087 Vector count 1 00:07:35.087 Module: software 00:07:35.087 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:35.087 Queue depth: 32 00:07:35.087 Allocate depth: 32 00:07:35.087 # threads/core: 2 00:07:35.087 Run time: 1 seconds 00:07:35.087 Verify: Yes 00:07:35.087 00:07:35.087 Running for 1 seconds... 
00:07:35.087 00:07:35.087 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:35.087 ------------------------------------------------------------------------------------ 00:07:35.087 0,1 2976/s 122 MiB/s 0 0 00:07:35.087 0,0 2912/s 120 MiB/s 0 0 00:07:35.087 ==================================================================================== 00:07:35.087 Total 5888/s 624 MiB/s 0 0' 00:07:35.087 10:38:51 -- accel/accel.sh@20 -- # IFS=: 00:07:35.087 10:38:51 -- accel/accel.sh@20 -- # read -r var val 00:07:35.087 10:38:51 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:35.087 10:38:51 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:35.087 10:38:51 -- accel/accel.sh@12 -- # build_accel_config 00:07:35.087 10:38:51 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:35.087 10:38:51 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:35.087 10:38:51 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:35.087 10:38:51 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:35.087 10:38:51 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:35.087 10:38:51 -- accel/accel.sh@41 -- # local IFS=, 00:07:35.087 10:38:51 -- accel/accel.sh@42 -- # jq -r . 00:07:35.087 [2024-07-13 10:38:51.165376] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:35.087 [2024-07-13 10:38:51.165476] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1987715 ] 00:07:35.087 EAL: No free 2048 kB hugepages reported on node 1 00:07:35.087 [2024-07-13 10:38:51.236049] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:35.087 [2024-07-13 10:38:51.270176] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.087 10:38:51 -- accel/accel.sh@21 -- # val= 00:07:35.087 10:38:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.087 10:38:51 -- accel/accel.sh@20 -- # IFS=: 00:07:35.087 10:38:51 -- accel/accel.sh@20 -- # read -r var val 00:07:35.087 10:38:51 -- accel/accel.sh@21 -- # val= 00:07:35.087 10:38:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.087 10:38:51 -- accel/accel.sh@20 -- # IFS=: 00:07:35.087 10:38:51 -- accel/accel.sh@20 -- # read -r var val 00:07:35.087 10:38:51 -- accel/accel.sh@21 -- # val= 00:07:35.087 10:38:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.087 10:38:51 -- accel/accel.sh@20 -- # IFS=: 00:07:35.087 10:38:51 -- accel/accel.sh@20 -- # read -r var val 00:07:35.087 10:38:51 -- accel/accel.sh@21 -- # val=0x1 00:07:35.088 10:38:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # IFS=: 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # read -r var val 00:07:35.088 10:38:51 -- accel/accel.sh@21 -- # val= 00:07:35.088 10:38:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # IFS=: 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # read -r var val 00:07:35.088 10:38:51 -- accel/accel.sh@21 -- # val= 00:07:35.088 10:38:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # IFS=: 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # read -r var val 00:07:35.088 10:38:51 -- accel/accel.sh@21 -- # val=decompress 
00:07:35.088 10:38:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.088 10:38:51 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # IFS=: 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # read -r var val 00:07:35.088 10:38:51 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:35.088 10:38:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # IFS=: 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # read -r var val 00:07:35.088 10:38:51 -- accel/accel.sh@21 -- # val= 00:07:35.088 10:38:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # IFS=: 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # read -r var val 00:07:35.088 10:38:51 -- accel/accel.sh@21 -- # val=software 00:07:35.088 10:38:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.088 10:38:51 -- accel/accel.sh@23 -- # accel_module=software 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # IFS=: 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # read -r var val 00:07:35.088 10:38:51 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:35.088 10:38:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # IFS=: 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # read -r var val 00:07:35.088 10:38:51 -- accel/accel.sh@21 -- # val=32 00:07:35.088 10:38:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # IFS=: 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # read -r var val 00:07:35.088 10:38:51 -- accel/accel.sh@21 -- # val=32 00:07:35.088 10:38:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # IFS=: 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # read -r var val 00:07:35.088 10:38:51 -- accel/accel.sh@21 -- # val=2 00:07:35.088 10:38:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # IFS=: 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # read -r var val 00:07:35.088 10:38:51 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:35.088 10:38:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # IFS=: 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # read -r var val 00:07:35.088 10:38:51 -- accel/accel.sh@21 -- # val=Yes 00:07:35.088 10:38:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # IFS=: 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # read -r var val 00:07:35.088 10:38:51 -- accel/accel.sh@21 -- # val= 00:07:35.088 10:38:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # IFS=: 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # read -r var val 00:07:35.088 10:38:51 -- accel/accel.sh@21 -- # val= 00:07:35.088 10:38:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # IFS=: 00:07:35.088 10:38:51 -- accel/accel.sh@20 -- # read -r var val 00:07:36.463 10:38:52 -- accel/accel.sh@21 -- # val= 00:07:36.463 10:38:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:36.463 10:38:52 -- accel/accel.sh@20 -- # IFS=: 00:07:36.463 10:38:52 -- accel/accel.sh@20 -- # read -r var val 00:07:36.463 10:38:52 -- accel/accel.sh@21 -- # val= 00:07:36.463 10:38:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:36.463 10:38:52 -- accel/accel.sh@20 -- # IFS=: 00:07:36.463 10:38:52 -- accel/accel.sh@20 -- # read -r var val 00:07:36.463 10:38:52 -- accel/accel.sh@21 -- # val= 00:07:36.463 10:38:52 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:36.463 10:38:52 -- accel/accel.sh@20 -- # IFS=: 00:07:36.463 10:38:52 -- accel/accel.sh@20 -- # read -r var val 00:07:36.463 10:38:52 -- accel/accel.sh@21 -- # val= 00:07:36.463 10:38:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:36.463 10:38:52 -- accel/accel.sh@20 -- # IFS=: 00:07:36.463 10:38:52 -- accel/accel.sh@20 -- # read -r var val 00:07:36.463 10:38:52 -- accel/accel.sh@21 -- # val= 00:07:36.463 10:38:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:36.463 10:38:52 -- accel/accel.sh@20 -- # IFS=: 00:07:36.463 10:38:52 -- accel/accel.sh@20 -- # read -r var val 00:07:36.463 10:38:52 -- accel/accel.sh@21 -- # val= 00:07:36.463 10:38:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:36.463 10:38:52 -- accel/accel.sh@20 -- # IFS=: 00:07:36.463 10:38:52 -- accel/accel.sh@20 -- # read -r var val 00:07:36.463 10:38:52 -- accel/accel.sh@21 -- # val= 00:07:36.463 10:38:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:36.463 10:38:52 -- accel/accel.sh@20 -- # IFS=: 00:07:36.463 10:38:52 -- accel/accel.sh@20 -- # read -r var val 00:07:36.463 10:38:52 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:36.463 10:38:52 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:36.463 10:38:52 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:36.463 00:07:36.463 real 0m2.633s 00:07:36.463 user 0m2.373s 00:07:36.463 sys 0m0.266s 00:07:36.463 10:38:52 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:36.463 10:38:52 -- common/autotest_common.sh@10 -- # set +x 00:07:36.463 ************************************ 00:07:36.463 END TEST accel_deomp_full_mthread 00:07:36.463 ************************************ 00:07:36.463 10:38:52 -- accel/accel.sh@116 -- # [[ n == y ]] 00:07:36.463 10:38:52 -- accel/accel.sh@129 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:36.463 10:38:52 -- accel/accel.sh@129 -- # build_accel_config 00:07:36.463 10:38:52 -- common/autotest_common.sh@1077 -- # '[' 4 -le 1 ']' 00:07:36.463 10:38:52 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:36.463 10:38:52 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:36.463 10:38:52 -- common/autotest_common.sh@10 -- # set +x 00:07:36.463 10:38:52 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:36.463 10:38:52 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:36.463 10:38:52 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:36.463 10:38:52 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:36.463 10:38:52 -- accel/accel.sh@41 -- # local IFS=, 00:07:36.463 10:38:52 -- accel/accel.sh@42 -- # jq -r . 00:07:36.463 ************************************ 00:07:36.463 START TEST accel_dif_functional_tests 00:07:36.463 ************************************ 00:07:36.463 10:38:52 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:36.463 [2024-07-13 10:38:52.538284] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:07:36.463 [2024-07-13 10:38:52.538367] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1987905 ] 00:07:36.463 EAL: No free 2048 kB hugepages reported on node 1 00:07:36.463 [2024-07-13 10:38:52.608826] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:36.463 [2024-07-13 10:38:52.645683] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:36.463 [2024-07-13 10:38:52.645787] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:36.463 [2024-07-13 10:38:52.645789] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:36.463 00:07:36.463 00:07:36.463 CUnit - A unit testing framework for C - Version 2.1-3 00:07:36.463 http://cunit.sourceforge.net/ 00:07:36.463 00:07:36.463 00:07:36.463 Suite: accel_dif 00:07:36.463 Test: verify: DIF generated, GUARD check ...passed 00:07:36.463 Test: verify: DIF generated, APPTAG check ...passed 00:07:36.463 Test: verify: DIF generated, REFTAG check ...passed 00:07:36.463 Test: verify: DIF not generated, GUARD check ...[2024-07-13 10:38:52.708178] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:36.463 [2024-07-13 10:38:52.708228] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:36.463 passed 00:07:36.463 Test: verify: DIF not generated, APPTAG check ...[2024-07-13 10:38:52.708263] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:36.463 [2024-07-13 10:38:52.708281] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:36.463 passed 00:07:36.463 Test: verify: DIF not generated, REFTAG check ...[2024-07-13 10:38:52.708302] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:36.463 [2024-07-13 10:38:52.708320] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:36.463 passed 00:07:36.463 Test: verify: APPTAG correct, APPTAG check ...passed 00:07:36.463 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-13 10:38:52.708361] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:07:36.463 passed 00:07:36.463 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:07:36.463 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:07:36.463 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:07:36.463 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-13 10:38:52.708461] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:07:36.463 passed 00:07:36.463 Test: generate copy: DIF generated, GUARD check ...passed 00:07:36.463 Test: generate copy: DIF generated, APTTAG check ...passed 00:07:36.463 Test: generate copy: DIF generated, REFTAG check ...passed 00:07:36.463 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:07:36.463 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:07:36.463 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:07:36.463 Test: generate copy: iovecs-len validate ...[2024-07-13 10:38:52.708633] dif.c:1167:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:07:36.463 passed 00:07:36.463 Test: generate copy: buffer alignment validate ...passed 00:07:36.463 00:07:36.463 Run Summary: Type Total Ran Passed Failed Inactive 00:07:36.463 suites 1 1 n/a 0 0 00:07:36.463 tests 20 20 20 0 0 00:07:36.464 asserts 204 204 204 0 n/a 00:07:36.464 00:07:36.464 Elapsed time = 0.002 seconds 00:07:36.723 00:07:36.723 real 0m0.344s 00:07:36.723 user 0m0.530s 00:07:36.723 sys 0m0.156s 00:07:36.723 10:38:52 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:36.723 10:38:52 -- common/autotest_common.sh@10 -- # set +x 00:07:36.723 ************************************ 00:07:36.723 END TEST accel_dif_functional_tests 00:07:36.723 ************************************ 00:07:36.723 00:07:36.723 real 0m55.272s 00:07:36.723 user 1m2.898s 00:07:36.723 sys 0m7.091s 00:07:36.723 10:38:52 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:36.723 10:38:52 -- common/autotest_common.sh@10 -- # set +x 00:07:36.723 ************************************ 00:07:36.723 END TEST accel 00:07:36.723 ************************************ 00:07:36.723 10:38:52 -- spdk/autotest.sh@190 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:36.723 10:38:52 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:36.723 10:38:52 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:36.723 10:38:52 -- common/autotest_common.sh@10 -- # set +x 00:07:36.723 ************************************ 00:07:36.723 START TEST accel_rpc 00:07:36.723 ************************************ 00:07:36.723 10:38:52 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:36.723 * Looking for test storage... 00:07:36.723 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:07:36.723 10:38:53 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:36.723 10:38:53 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=1988082 00:07:36.723 10:38:53 -- accel/accel_rpc.sh@15 -- # waitforlisten 1988082 00:07:36.723 10:38:53 -- common/autotest_common.sh@819 -- # '[' -z 1988082 ']' 00:07:36.723 10:38:53 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:07:36.723 10:38:53 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:36.723 10:38:53 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:36.723 10:38:53 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:36.723 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:36.723 10:38:53 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:36.723 10:38:53 -- common/autotest_common.sh@10 -- # set +x 00:07:36.723 [2024-07-13 10:38:53.067458] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:07:36.723 [2024-07-13 10:38:53.067529] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1988082 ] 00:07:36.723 EAL: No free 2048 kB hugepages reported on node 1 00:07:36.982 [2024-07-13 10:38:53.133728] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:36.982 [2024-07-13 10:38:53.173158] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:36.982 [2024-07-13 10:38:53.173281] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:36.982 10:38:53 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:36.982 10:38:53 -- common/autotest_common.sh@852 -- # return 0 00:07:36.982 10:38:53 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:07:36.982 10:38:53 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:07:36.982 10:38:53 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:07:36.982 10:38:53 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:07:36.982 10:38:53 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:07:36.982 10:38:53 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:36.982 10:38:53 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:36.982 10:38:53 -- common/autotest_common.sh@10 -- # set +x 00:07:36.982 ************************************ 00:07:36.982 START TEST accel_assign_opcode 00:07:36.982 ************************************ 00:07:36.982 10:38:53 -- common/autotest_common.sh@1104 -- # accel_assign_opcode_test_suite 00:07:36.982 10:38:53 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:07:36.982 10:38:53 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:36.982 10:38:53 -- common/autotest_common.sh@10 -- # set +x 00:07:36.982 [2024-07-13 10:38:53.233754] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:07:36.982 10:38:53 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:36.982 10:38:53 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:07:36.982 10:38:53 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:36.982 10:38:53 -- common/autotest_common.sh@10 -- # set +x 00:07:36.982 [2024-07-13 10:38:53.241769] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:07:36.982 10:38:53 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:36.982 10:38:53 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:07:36.982 10:38:53 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:36.982 10:38:53 -- common/autotest_common.sh@10 -- # set +x 00:07:37.241 10:38:53 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:37.241 10:38:53 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:07:37.241 10:38:53 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:07:37.241 10:38:53 -- accel/accel_rpc.sh@42 -- # grep software 00:07:37.241 10:38:53 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:37.241 10:38:53 -- common/autotest_common.sh@10 -- # set +x 00:07:37.241 10:38:53 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:37.241 software 00:07:37.241 00:07:37.241 real 0m0.215s 00:07:37.241 user 0m0.040s 00:07:37.241 sys 0m0.015s 00:07:37.241 10:38:53 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:37.241 10:38:53 -- common/autotest_common.sh@10 -- # set +x 
00:07:37.241 ************************************ 00:07:37.241 END TEST accel_assign_opcode 00:07:37.241 ************************************ 00:07:37.241 10:38:53 -- accel/accel_rpc.sh@55 -- # killprocess 1988082 00:07:37.241 10:38:53 -- common/autotest_common.sh@926 -- # '[' -z 1988082 ']' 00:07:37.241 10:38:53 -- common/autotest_common.sh@930 -- # kill -0 1988082 00:07:37.241 10:38:53 -- common/autotest_common.sh@931 -- # uname 00:07:37.242 10:38:53 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:37.242 10:38:53 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1988082 00:07:37.242 10:38:53 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:37.242 10:38:53 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:37.242 10:38:53 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1988082' 00:07:37.242 killing process with pid 1988082 00:07:37.242 10:38:53 -- common/autotest_common.sh@945 -- # kill 1988082 00:07:37.242 10:38:53 -- common/autotest_common.sh@950 -- # wait 1988082 00:07:37.501 00:07:37.501 real 0m0.867s 00:07:37.501 user 0m0.778s 00:07:37.501 sys 0m0.408s 00:07:37.501 10:38:53 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:37.501 10:38:53 -- common/autotest_common.sh@10 -- # set +x 00:07:37.501 ************************************ 00:07:37.501 END TEST accel_rpc 00:07:37.501 ************************************ 00:07:37.501 10:38:53 -- spdk/autotest.sh@191 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:37.501 10:38:53 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:37.501 10:38:53 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:37.501 10:38:53 -- common/autotest_common.sh@10 -- # set +x 00:07:37.501 ************************************ 00:07:37.501 START TEST app_cmdline 00:07:37.501 ************************************ 00:07:37.501 10:38:53 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:37.760 * Looking for test storage... 00:07:37.760 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:37.760 10:38:53 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:37.760 10:38:53 -- app/cmdline.sh@17 -- # spdk_tgt_pid=1988270 00:07:37.760 10:38:53 -- app/cmdline.sh@18 -- # waitforlisten 1988270 00:07:37.760 10:38:53 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:37.760 10:38:53 -- common/autotest_common.sh@819 -- # '[' -z 1988270 ']' 00:07:37.760 10:38:53 -- common/autotest_common.sh@823 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:37.760 10:38:53 -- common/autotest_common.sh@824 -- # local max_retries=100 00:07:37.760 10:38:53 -- common/autotest_common.sh@826 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:37.760 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:37.760 10:38:53 -- common/autotest_common.sh@828 -- # xtrace_disable 00:07:37.760 10:38:53 -- common/autotest_common.sh@10 -- # set +x 00:07:37.760 [2024-07-13 10:38:53.999061] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:07:37.760 [2024-07-13 10:38:53.999133] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1988270 ] 00:07:37.760 EAL: No free 2048 kB hugepages reported on node 1 00:07:37.760 [2024-07-13 10:38:54.066330] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:37.760 [2024-07-13 10:38:54.103591] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:37.760 [2024-07-13 10:38:54.103713] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:38.704 10:38:54 -- common/autotest_common.sh@848 -- # (( i == 0 )) 00:07:38.704 10:38:54 -- common/autotest_common.sh@852 -- # return 0 00:07:38.704 10:38:54 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:38.704 { 00:07:38.704 "version": "SPDK v24.01.1-pre git sha1 4b94202c6", 00:07:38.704 "fields": { 00:07:38.704 "major": 24, 00:07:38.704 "minor": 1, 00:07:38.704 "patch": 1, 00:07:38.704 "suffix": "-pre", 00:07:38.704 "commit": "4b94202c6" 00:07:38.704 } 00:07:38.704 } 00:07:38.704 10:38:54 -- app/cmdline.sh@22 -- # expected_methods=() 00:07:38.704 10:38:54 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:38.704 10:38:54 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:38.704 10:38:54 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:38.704 10:38:54 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:38.704 10:38:54 -- common/autotest_common.sh@551 -- # xtrace_disable 00:07:38.704 10:38:54 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:38.704 10:38:54 -- common/autotest_common.sh@10 -- # set +x 00:07:38.704 10:38:54 -- app/cmdline.sh@26 -- # sort 00:07:38.704 10:38:54 -- common/autotest_common.sh@579 -- # [[ 0 == 0 ]] 00:07:38.704 10:38:55 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:38.704 10:38:55 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:38.704 10:38:55 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:38.704 10:38:55 -- common/autotest_common.sh@640 -- # local es=0 00:07:38.704 10:38:55 -- common/autotest_common.sh@642 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:38.704 10:38:55 -- common/autotest_common.sh@628 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:38.704 10:38:55 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:38.704 10:38:55 -- common/autotest_common.sh@632 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:38.704 10:38:55 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:38.704 10:38:55 -- common/autotest_common.sh@634 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:38.704 10:38:55 -- common/autotest_common.sh@632 -- # case "$(type -t "$arg")" in 00:07:38.704 10:38:55 -- common/autotest_common.sh@634 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:38.704 10:38:55 -- common/autotest_common.sh@634 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:38.704 10:38:55 -- 
common/autotest_common.sh@643 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:38.962 request: 00:07:38.962 { 00:07:38.962 "method": "env_dpdk_get_mem_stats", 00:07:38.962 "req_id": 1 00:07:38.962 } 00:07:38.962 Got JSON-RPC error response 00:07:38.962 response: 00:07:38.962 { 00:07:38.962 "code": -32601, 00:07:38.962 "message": "Method not found" 00:07:38.962 } 00:07:38.962 10:38:55 -- common/autotest_common.sh@643 -- # es=1 00:07:38.962 10:38:55 -- common/autotest_common.sh@651 -- # (( es > 128 )) 00:07:38.962 10:38:55 -- common/autotest_common.sh@662 -- # [[ -n '' ]] 00:07:38.962 10:38:55 -- common/autotest_common.sh@667 -- # (( !es == 0 )) 00:07:38.962 10:38:55 -- app/cmdline.sh@1 -- # killprocess 1988270 00:07:38.962 10:38:55 -- common/autotest_common.sh@926 -- # '[' -z 1988270 ']' 00:07:38.962 10:38:55 -- common/autotest_common.sh@930 -- # kill -0 1988270 00:07:38.962 10:38:55 -- common/autotest_common.sh@931 -- # uname 00:07:38.962 10:38:55 -- common/autotest_common.sh@931 -- # '[' Linux = Linux ']' 00:07:38.962 10:38:55 -- common/autotest_common.sh@932 -- # ps --no-headers -o comm= 1988270 00:07:38.962 10:38:55 -- common/autotest_common.sh@932 -- # process_name=reactor_0 00:07:38.962 10:38:55 -- common/autotest_common.sh@936 -- # '[' reactor_0 = sudo ']' 00:07:38.962 10:38:55 -- common/autotest_common.sh@944 -- # echo 'killing process with pid 1988270' 00:07:38.962 killing process with pid 1988270 00:07:38.962 10:38:55 -- common/autotest_common.sh@945 -- # kill 1988270 00:07:38.962 10:38:55 -- common/autotest_common.sh@950 -- # wait 1988270 00:07:39.221 00:07:39.221 real 0m1.639s 00:07:39.221 user 0m1.908s 00:07:39.221 sys 0m0.463s 00:07:39.221 10:38:55 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:39.221 10:38:55 -- common/autotest_common.sh@10 -- # set +x 00:07:39.221 ************************************ 00:07:39.221 END TEST app_cmdline 00:07:39.221 ************************************ 00:07:39.221 10:38:55 -- spdk/autotest.sh@192 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:39.221 10:38:55 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:39.221 10:38:55 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:39.221 10:38:55 -- common/autotest_common.sh@10 -- # set +x 00:07:39.221 ************************************ 00:07:39.221 START TEST version 00:07:39.221 ************************************ 00:07:39.221 10:38:55 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:39.480 * Looking for test storage... 
00:07:39.480 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:39.480 10:38:55 -- app/version.sh@17 -- # get_header_version major 00:07:39.480 10:38:55 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:39.480 10:38:55 -- app/version.sh@14 -- # cut -f2 00:07:39.480 10:38:55 -- app/version.sh@14 -- # tr -d '"' 00:07:39.480 10:38:55 -- app/version.sh@17 -- # major=24 00:07:39.480 10:38:55 -- app/version.sh@18 -- # get_header_version minor 00:07:39.480 10:38:55 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:39.480 10:38:55 -- app/version.sh@14 -- # cut -f2 00:07:39.480 10:38:55 -- app/version.sh@14 -- # tr -d '"' 00:07:39.480 10:38:55 -- app/version.sh@18 -- # minor=1 00:07:39.480 10:38:55 -- app/version.sh@19 -- # get_header_version patch 00:07:39.480 10:38:55 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:39.480 10:38:55 -- app/version.sh@14 -- # cut -f2 00:07:39.480 10:38:55 -- app/version.sh@14 -- # tr -d '"' 00:07:39.480 10:38:55 -- app/version.sh@19 -- # patch=1 00:07:39.480 10:38:55 -- app/version.sh@20 -- # get_header_version suffix 00:07:39.480 10:38:55 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:39.480 10:38:55 -- app/version.sh@14 -- # cut -f2 00:07:39.480 10:38:55 -- app/version.sh@14 -- # tr -d '"' 00:07:39.480 10:38:55 -- app/version.sh@20 -- # suffix=-pre 00:07:39.480 10:38:55 -- app/version.sh@22 -- # version=24.1 00:07:39.480 10:38:55 -- app/version.sh@25 -- # (( patch != 0 )) 00:07:39.480 10:38:55 -- app/version.sh@25 -- # version=24.1.1 00:07:39.480 10:38:55 -- app/version.sh@28 -- # version=24.1.1rc0 00:07:39.480 10:38:55 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:39.480 10:38:55 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:39.480 10:38:55 -- app/version.sh@30 -- # py_version=24.1.1rc0 00:07:39.480 10:38:55 -- app/version.sh@31 -- # [[ 24.1.1rc0 == \2\4\.\1\.\1\r\c\0 ]] 00:07:39.480 00:07:39.480 real 0m0.161s 00:07:39.480 user 0m0.078s 00:07:39.480 sys 0m0.127s 00:07:39.480 10:38:55 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:39.480 10:38:55 -- common/autotest_common.sh@10 -- # set +x 00:07:39.480 ************************************ 00:07:39.480 END TEST version 00:07:39.480 ************************************ 00:07:39.480 10:38:55 -- spdk/autotest.sh@194 -- # '[' 0 -eq 1 ']' 00:07:39.480 10:38:55 -- spdk/autotest.sh@204 -- # uname -s 00:07:39.480 10:38:55 -- spdk/autotest.sh@204 -- # [[ Linux == Linux ]] 00:07:39.480 10:38:55 -- spdk/autotest.sh@205 -- # [[ 0 -eq 1 ]] 00:07:39.480 10:38:55 -- spdk/autotest.sh@205 -- # [[ 0 -eq 1 ]] 00:07:39.480 10:38:55 -- spdk/autotest.sh@217 -- # '[' 0 -eq 1 ']' 00:07:39.480 10:38:55 -- spdk/autotest.sh@264 -- # '[' 0 -eq 1 ']' 00:07:39.480 10:38:55 -- spdk/autotest.sh@268 -- # timing_exit lib 
00:07:39.480 10:38:55 -- common/autotest_common.sh@718 -- # xtrace_disable 00:07:39.480 10:38:55 -- common/autotest_common.sh@10 -- # set +x 00:07:39.480 10:38:55 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:07:39.480 10:38:55 -- spdk/autotest.sh@278 -- # '[' 0 -eq 1 ']' 00:07:39.480 10:38:55 -- spdk/autotest.sh@287 -- # '[' 0 -eq 1 ']' 00:07:39.480 10:38:55 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:39.480 10:38:55 -- spdk/autotest.sh@315 -- # '[' 0 -eq 1 ']' 00:07:39.480 10:38:55 -- spdk/autotest.sh@319 -- # '[' 0 -eq 1 ']' 00:07:39.480 10:38:55 -- spdk/autotest.sh@324 -- # '[' 0 -eq 1 ']' 00:07:39.480 10:38:55 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:07:39.480 10:38:55 -- spdk/autotest.sh@338 -- # '[' 0 -eq 1 ']' 00:07:39.480 10:38:55 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:39.480 10:38:55 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:39.480 10:38:55 -- spdk/autotest.sh@350 -- # '[' 0 -eq 1 ']' 00:07:39.480 10:38:55 -- spdk/autotest.sh@355 -- # '[' 0 -eq 1 ']' 00:07:39.480 10:38:55 -- spdk/autotest.sh@359 -- # '[' 0 -eq 1 ']' 00:07:39.480 10:38:55 -- spdk/autotest.sh@366 -- # [[ 0 -eq 1 ]] 00:07:39.480 10:38:55 -- spdk/autotest.sh@370 -- # [[ 0 -eq 1 ]] 00:07:39.480 10:38:55 -- spdk/autotest.sh@374 -- # [[ 1 -eq 1 ]] 00:07:39.480 10:38:55 -- spdk/autotest.sh@375 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:39.480 10:38:55 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:39.480 10:38:55 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:39.480 10:38:55 -- common/autotest_common.sh@10 -- # set +x 00:07:39.480 ************************************ 00:07:39.480 START TEST llvm_fuzz 00:07:39.480 ************************************ 00:07:39.480 10:38:55 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:39.742 * Looking for test storage... 
00:07:39.742 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:39.742 10:38:55 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:39.742 10:38:55 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:39.742 10:38:55 -- common/autotest_common.sh@538 -- # fuzzers=() 00:07:39.742 10:38:55 -- common/autotest_common.sh@538 -- # local fuzzers 00:07:39.742 10:38:55 -- common/autotest_common.sh@540 -- # [[ -n '' ]] 00:07:39.742 10:38:55 -- common/autotest_common.sh@543 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:39.742 10:38:55 -- common/autotest_common.sh@544 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:39.742 10:38:55 -- common/autotest_common.sh@547 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:39.742 10:38:55 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:39.742 10:38:55 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:07:39.742 10:38:55 -- fuzz/llvm.sh@56 -- # [[ 1 -eq 0 ]] 00:07:39.742 10:38:55 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:39.742 10:38:55 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:39.742 10:38:55 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:39.742 10:38:55 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:39.742 10:38:55 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:39.742 10:38:55 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:39.742 10:38:55 -- fuzz/llvm.sh@62 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:39.742 10:38:55 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:07:39.742 10:38:55 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:07:39.742 10:38:55 -- common/autotest_common.sh@10 -- # set +x 00:07:39.742 ************************************ 00:07:39.742 START TEST nvmf_fuzz 00:07:39.742 ************************************ 00:07:39.742 10:38:55 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:39.742 * Looking for test storage... 
00:07:39.742 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:39.742 10:38:55 -- nvmf/run.sh@52 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:39.742 10:38:55 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:39.742 10:38:55 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:39.742 10:38:55 -- common/autotest_common.sh@34 -- # set -e 00:07:39.742 10:38:55 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:39.742 10:38:55 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:39.742 10:38:55 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:39.742 10:38:56 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:39.742 10:38:56 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:39.742 10:38:56 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:39.742 10:38:56 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:39.742 10:38:56 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:39.742 10:38:56 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:39.742 10:38:56 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:39.742 10:38:56 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:39.742 10:38:56 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:39.742 10:38:56 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:39.742 10:38:56 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:39.742 10:38:56 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:39.742 10:38:56 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:39.742 10:38:56 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:39.742 10:38:56 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:39.742 10:38:56 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:39.742 10:38:56 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:39.742 10:38:56 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:39.742 10:38:56 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:39.742 10:38:56 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:39.742 10:38:56 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:39.742 10:38:56 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:39.742 10:38:56 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:39.742 10:38:56 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:39.742 10:38:56 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:07:39.742 10:38:56 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:39.742 10:38:56 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:39.742 10:38:56 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:39.742 10:38:56 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:07:39.742 10:38:56 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:07:39.742 10:38:56 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:07:39.742 10:38:56 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:07:39.742 10:38:56 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:07:39.742 10:38:56 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:07:39.742 10:38:56 -- common/build_config.sh@34 -- # 
CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:07:39.742 10:38:56 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:07:39.742 10:38:56 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:39.742 10:38:56 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:07:39.742 10:38:56 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:07:39.742 10:38:56 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:07:39.742 10:38:56 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:07:39.742 10:38:56 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:39.742 10:38:56 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:07:39.742 10:38:56 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:07:39.742 10:38:56 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:39.742 10:38:56 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:07:39.742 10:38:56 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:07:39.742 10:38:56 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:07:39.742 10:38:56 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:39.742 10:38:56 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:07:39.742 10:38:56 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:07:39.742 10:38:56 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:07:39.742 10:38:56 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:07:39.742 10:38:56 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:07:39.742 10:38:56 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:07:39.743 10:38:56 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:07:39.743 10:38:56 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:07:39.743 10:38:56 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:07:39.743 10:38:56 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:07:39.743 10:38:56 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:07:39.743 10:38:56 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:07:39.743 10:38:56 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:39.743 10:38:56 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:07:39.743 10:38:56 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:07:39.743 10:38:56 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:07:39.743 10:38:56 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:07:39.743 10:38:56 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:39.743 10:38:56 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:07:39.743 10:38:56 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:07:39.743 10:38:56 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:07:39.743 10:38:56 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:07:39.743 10:38:56 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:07:39.743 10:38:56 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:07:39.743 10:38:56 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:07:39.743 10:38:56 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:07:39.743 10:38:56 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:07:39.743 10:38:56 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:07:39.743 10:38:56 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:39.743 10:38:56 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:07:39.743 
10:38:56 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:07:39.743 10:38:56 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:39.743 10:38:56 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:39.743 10:38:56 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:39.743 10:38:56 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:39.743 10:38:56 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:39.743 10:38:56 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:39.743 10:38:56 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:39.743 10:38:56 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:39.743 10:38:56 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:39.743 10:38:56 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:39.743 10:38:56 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:39.743 10:38:56 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:39.743 10:38:56 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:39.743 10:38:56 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:39.743 10:38:56 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:39.743 10:38:56 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:39.743 #define SPDK_CONFIG_H 00:07:39.743 #define SPDK_CONFIG_APPS 1 00:07:39.743 #define SPDK_CONFIG_ARCH native 00:07:39.743 #undef SPDK_CONFIG_ASAN 00:07:39.743 #undef SPDK_CONFIG_AVAHI 00:07:39.743 #undef SPDK_CONFIG_CET 00:07:39.743 #define SPDK_CONFIG_COVERAGE 1 00:07:39.743 #define SPDK_CONFIG_CROSS_PREFIX 00:07:39.743 #undef SPDK_CONFIG_CRYPTO 00:07:39.743 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:39.743 #undef SPDK_CONFIG_CUSTOMOCF 00:07:39.743 #undef SPDK_CONFIG_DAOS 00:07:39.743 #define SPDK_CONFIG_DAOS_DIR 00:07:39.743 #define SPDK_CONFIG_DEBUG 1 00:07:39.743 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:39.743 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:39.743 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:39.743 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:39.743 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:39.743 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:39.743 #define SPDK_CONFIG_EXAMPLES 1 00:07:39.743 #undef SPDK_CONFIG_FC 00:07:39.743 #define SPDK_CONFIG_FC_PATH 00:07:39.743 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:39.743 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:39.743 #undef SPDK_CONFIG_FUSE 00:07:39.743 #define SPDK_CONFIG_FUZZER 1 00:07:39.743 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:07:39.743 #undef SPDK_CONFIG_GOLANG 00:07:39.743 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:39.743 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:39.743 #undef 
SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:39.743 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:39.743 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:39.743 #define SPDK_CONFIG_IDXD 1 00:07:39.743 #define SPDK_CONFIG_IDXD_KERNEL 1 00:07:39.743 #undef SPDK_CONFIG_IPSEC_MB 00:07:39.743 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:39.743 #define SPDK_CONFIG_ISAL 1 00:07:39.743 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:39.743 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:39.743 #define SPDK_CONFIG_LIBDIR 00:07:39.743 #undef SPDK_CONFIG_LTO 00:07:39.743 #define SPDK_CONFIG_MAX_LCORES 00:07:39.743 #define SPDK_CONFIG_NVME_CUSE 1 00:07:39.743 #undef SPDK_CONFIG_OCF 00:07:39.743 #define SPDK_CONFIG_OCF_PATH 00:07:39.743 #define SPDK_CONFIG_OPENSSL_PATH 00:07:39.743 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:39.743 #undef SPDK_CONFIG_PGO_USE 00:07:39.743 #define SPDK_CONFIG_PREFIX /usr/local 00:07:39.743 #undef SPDK_CONFIG_RAID5F 00:07:39.743 #undef SPDK_CONFIG_RBD 00:07:39.743 #define SPDK_CONFIG_RDMA 1 00:07:39.743 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:39.743 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:39.743 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:39.743 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:39.743 #undef SPDK_CONFIG_SHARED 00:07:39.743 #undef SPDK_CONFIG_SMA 00:07:39.743 #define SPDK_CONFIG_TESTS 1 00:07:39.743 #undef SPDK_CONFIG_TSAN 00:07:39.743 #define SPDK_CONFIG_UBLK 1 00:07:39.743 #define SPDK_CONFIG_UBSAN 1 00:07:39.743 #undef SPDK_CONFIG_UNIT_TESTS 00:07:39.743 #undef SPDK_CONFIG_URING 00:07:39.743 #define SPDK_CONFIG_URING_PATH 00:07:39.743 #undef SPDK_CONFIG_URING_ZNS 00:07:39.743 #undef SPDK_CONFIG_USDT 00:07:39.743 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:39.743 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:39.743 #define SPDK_CONFIG_VFIO_USER 1 00:07:39.743 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:39.743 #define SPDK_CONFIG_VHOST 1 00:07:39.743 #define SPDK_CONFIG_VIRTIO 1 00:07:39.743 #undef SPDK_CONFIG_VTUNE 00:07:39.743 #define SPDK_CONFIG_VTUNE_DIR 00:07:39.743 #define SPDK_CONFIG_WERROR 1 00:07:39.743 #define SPDK_CONFIG_WPDK_DIR 00:07:39.743 #undef SPDK_CONFIG_XNVME 00:07:39.743 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:39.743 10:38:56 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:39.743 10:38:56 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:39.743 10:38:56 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:39.743 10:38:56 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:39.743 10:38:56 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:39.743 10:38:56 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:39.743 10:38:56 -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:39.743 10:38:56 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:39.743 10:38:56 -- paths/export.sh@5 -- # export PATH 00:07:39.743 10:38:56 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:39.743 10:38:56 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:39.743 10:38:56 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:39.743 10:38:56 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:39.743 10:38:56 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:39.743 10:38:56 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:39.743 10:38:56 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:39.743 10:38:56 -- pm/common@16 -- # TEST_TAG=N/A 00:07:39.743 10:38:56 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:39.743 10:38:56 -- common/autotest_common.sh@52 -- # : 1 00:07:39.743 10:38:56 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:07:39.743 10:38:56 -- common/autotest_common.sh@56 -- # : 0 00:07:39.743 10:38:56 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:39.743 10:38:56 -- common/autotest_common.sh@58 -- # : 0 00:07:39.743 10:38:56 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:07:39.743 10:38:56 -- common/autotest_common.sh@60 -- # : 1 00:07:39.743 10:38:56 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:07:39.743 10:38:56 -- common/autotest_common.sh@62 -- # : 0 00:07:39.743 10:38:56 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:07:39.743 10:38:56 -- common/autotest_common.sh@64 -- # : 00:07:39.743 10:38:56 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:07:39.743 10:38:56 -- common/autotest_common.sh@66 -- # : 0 00:07:39.743 10:38:56 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:07:39.743 10:38:56 
-- common/autotest_common.sh@68 -- # : 0 00:07:39.743 10:38:56 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:07:39.743 10:38:56 -- common/autotest_common.sh@70 -- # : 0 00:07:39.743 10:38:56 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:07:39.743 10:38:56 -- common/autotest_common.sh@72 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:39.744 10:38:56 -- common/autotest_common.sh@74 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:07:39.744 10:38:56 -- common/autotest_common.sh@76 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:07:39.744 10:38:56 -- common/autotest_common.sh@78 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:07:39.744 10:38:56 -- common/autotest_common.sh@80 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:07:39.744 10:38:56 -- common/autotest_common.sh@82 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:07:39.744 10:38:56 -- common/autotest_common.sh@84 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:07:39.744 10:38:56 -- common/autotest_common.sh@86 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:07:39.744 10:38:56 -- common/autotest_common.sh@88 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:07:39.744 10:38:56 -- common/autotest_common.sh@90 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:39.744 10:38:56 -- common/autotest_common.sh@92 -- # : 1 00:07:39.744 10:38:56 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:07:39.744 10:38:56 -- common/autotest_common.sh@94 -- # : 1 00:07:39.744 10:38:56 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:07:39.744 10:38:56 -- common/autotest_common.sh@96 -- # : rdma 00:07:39.744 10:38:56 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:39.744 10:38:56 -- common/autotest_common.sh@98 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:07:39.744 10:38:56 -- common/autotest_common.sh@100 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:07:39.744 10:38:56 -- common/autotest_common.sh@102 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:07:39.744 10:38:56 -- common/autotest_common.sh@104 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:07:39.744 10:38:56 -- common/autotest_common.sh@106 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:07:39.744 10:38:56 -- common/autotest_common.sh@108 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:07:39.744 10:38:56 -- common/autotest_common.sh@110 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:07:39.744 10:38:56 -- common/autotest_common.sh@112 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:39.744 10:38:56 -- common/autotest_common.sh@114 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:07:39.744 
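[note] The alternating ": <value>" / "export SPDK_TEST_*" pairs in this stretch are xtrace expansions of bash's default-assignment idiom in autotest_common.sh; roughly (a sketch of the pattern, not a verbatim excerpt):

  # Keep a value exported by the CI job, otherwise fall back to a default.
  : "${SPDK_TEST_FUZZER:=0}"              # traced above as ': 1' for this job
  export SPDK_TEST_FUZZER
  : "${SPDK_TEST_NVMF_TRANSPORT:=rdma}"   # traced above as ': rdma'
  export SPDK_TEST_NVMF_TRANSPORT
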
10:38:56 -- common/autotest_common.sh@116 -- # : 1 00:07:39.744 10:38:56 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:07:39.744 10:38:56 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:39.744 10:38:56 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:39.744 10:38:56 -- common/autotest_common.sh@120 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:07:39.744 10:38:56 -- common/autotest_common.sh@122 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:07:39.744 10:38:56 -- common/autotest_common.sh@124 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:07:39.744 10:38:56 -- common/autotest_common.sh@126 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:07:39.744 10:38:56 -- common/autotest_common.sh@128 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:07:39.744 10:38:56 -- common/autotest_common.sh@130 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:07:39.744 10:38:56 -- common/autotest_common.sh@132 -- # : v23.11 00:07:39.744 10:38:56 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:07:39.744 10:38:56 -- common/autotest_common.sh@134 -- # : true 00:07:39.744 10:38:56 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:07:39.744 10:38:56 -- common/autotest_common.sh@136 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:07:39.744 10:38:56 -- common/autotest_common.sh@138 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:07:39.744 10:38:56 -- common/autotest_common.sh@140 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:07:39.744 10:38:56 -- common/autotest_common.sh@142 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:07:39.744 10:38:56 -- common/autotest_common.sh@144 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:07:39.744 10:38:56 -- common/autotest_common.sh@146 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:07:39.744 10:38:56 -- common/autotest_common.sh@148 -- # : 00:07:39.744 10:38:56 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:07:39.744 10:38:56 -- common/autotest_common.sh@150 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:07:39.744 10:38:56 -- common/autotest_common.sh@152 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:07:39.744 10:38:56 -- common/autotest_common.sh@154 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:07:39.744 10:38:56 -- common/autotest_common.sh@156 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:07:39.744 10:38:56 -- common/autotest_common.sh@158 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:07:39.744 10:38:56 -- common/autotest_common.sh@160 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:07:39.744 10:38:56 -- common/autotest_common.sh@163 -- # : 00:07:39.744 10:38:56 -- 
common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:07:39.744 10:38:56 -- common/autotest_common.sh@165 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:07:39.744 10:38:56 -- common/autotest_common.sh@167 -- # : 0 00:07:39.744 10:38:56 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:39.744 10:38:56 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:39.744 10:38:56 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:39.744 10:38:56 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:39.744 10:38:56 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:39.744 10:38:56 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:39.744 10:38:56 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:39.744 10:38:56 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:39.744 10:38:56 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:39.744 10:38:56 -- common/autotest_common.sh@177 
-- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:39.744 10:38:56 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:39.744 10:38:56 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:39.744 10:38:56 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:39.744 10:38:56 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:39.744 10:38:56 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:07:39.744 10:38:56 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:39.744 10:38:56 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:39.744 10:38:56 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:39.744 10:38:56 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:39.744 10:38:56 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:39.744 10:38:56 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:07:39.744 10:38:56 -- common/autotest_common.sh@196 -- # cat 00:07:39.744 10:38:56 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:07:39.745 10:38:56 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:39.745 10:38:56 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:39.745 10:38:56 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:39.745 10:38:56 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:39.745 10:38:56 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:07:39.745 10:38:56 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:07:39.745 10:38:56 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:39.745 10:38:56 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:39.745 10:38:56 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:39.745 10:38:56 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:39.745 10:38:56 -- 
common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:39.745 10:38:56 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:39.745 10:38:56 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:39.745 10:38:56 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:39.745 10:38:56 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:39.745 10:38:56 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:39.745 10:38:56 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:39.745 10:38:56 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:39.745 10:38:56 -- common/autotest_common.sh@248 -- # '[' 0 -eq 0 ']' 00:07:39.745 10:38:56 -- common/autotest_common.sh@249 -- # export valgrind= 00:07:39.745 10:38:56 -- common/autotest_common.sh@249 -- # valgrind= 00:07:39.745 10:38:56 -- common/autotest_common.sh@255 -- # uname -s 00:07:39.745 10:38:56 -- common/autotest_common.sh@255 -- # '[' Linux = Linux ']' 00:07:39.745 10:38:56 -- common/autotest_common.sh@256 -- # HUGEMEM=4096 00:07:39.745 10:38:56 -- common/autotest_common.sh@257 -- # export CLEAR_HUGE=yes 00:07:39.745 10:38:56 -- common/autotest_common.sh@257 -- # CLEAR_HUGE=yes 00:07:39.745 10:38:56 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:07:39.745 10:38:56 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:07:39.745 10:38:56 -- common/autotest_common.sh@265 -- # MAKE=make 00:07:39.745 10:38:56 -- common/autotest_common.sh@266 -- # MAKEFLAGS=-j112 00:07:39.745 10:38:56 -- common/autotest_common.sh@282 -- # export HUGEMEM=4096 00:07:39.745 10:38:56 -- common/autotest_common.sh@282 -- # HUGEMEM=4096 00:07:39.745 10:38:56 -- common/autotest_common.sh@284 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:39.745 10:38:56 -- common/autotest_common.sh@289 -- # NO_HUGE=() 00:07:39.745 10:38:56 -- common/autotest_common.sh@290 -- # TEST_MODE= 00:07:39.745 10:38:56 -- common/autotest_common.sh@309 -- # [[ -z 1988839 ]] 00:07:39.745 10:38:56 -- common/autotest_common.sh@309 -- # kill -0 1988839 00:07:39.745 10:38:56 -- common/autotest_common.sh@1665 -- # set_test_storage 2147483648 00:07:39.745 10:38:56 -- common/autotest_common.sh@319 -- # [[ -v testdir ]] 00:07:39.745 10:38:56 -- common/autotest_common.sh@321 -- # local requested_size=2147483648 00:07:39.745 10:38:56 -- common/autotest_common.sh@322 -- # local mount target_dir 00:07:39.745 10:38:56 -- common/autotest_common.sh@324 -- # local -A mounts fss sizes avails uses 00:07:39.745 10:38:56 -- common/autotest_common.sh@325 -- # local source fs size avail mount use 00:07:39.745 10:38:56 -- common/autotest_common.sh@327 -- # local storage_fallback storage_candidates 00:07:39.745 10:38:56 -- common/autotest_common.sh@329 -- # mktemp -udt spdk.XXXXXX 00:07:39.745 10:38:56 -- common/autotest_common.sh@329 -- # storage_fallback=/tmp/spdk.I3W9F5 00:07:39.745 10:38:56 -- common/autotest_common.sh@334 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:39.745 10:38:56 -- common/autotest_common.sh@336 -- # [[ -n '' ]] 00:07:39.745 10:38:56 -- common/autotest_common.sh@341 -- # 
[[ -n '' ]] 00:07:39.745 10:38:56 -- common/autotest_common.sh@346 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.I3W9F5/tests/nvmf /tmp/spdk.I3W9F5 00:07:39.745 10:38:56 -- common/autotest_common.sh@349 -- # requested_size=2214592512 00:07:39.745 10:38:56 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:39.745 10:38:56 -- common/autotest_common.sh@318 -- # df -T 00:07:39.745 10:38:56 -- common/autotest_common.sh@318 -- # grep -v Filesystem 00:07:39.745 10:38:56 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_devtmpfs 00:07:39.745 10:38:56 -- common/autotest_common.sh@352 -- # fss["$mount"]=devtmpfs 00:07:39.745 10:38:56 -- common/autotest_common.sh@353 -- # avails["$mount"]=67108864 00:07:39.745 10:38:56 -- common/autotest_common.sh@353 -- # sizes["$mount"]=67108864 00:07:39.745 10:38:56 -- common/autotest_common.sh@354 -- # uses["$mount"]=0 00:07:39.745 10:38:56 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:39.745 10:38:56 -- common/autotest_common.sh@352 -- # mounts["$mount"]=/dev/pmem0 00:07:39.745 10:38:56 -- common/autotest_common.sh@352 -- # fss["$mount"]=ext2 00:07:39.745 10:38:56 -- common/autotest_common.sh@353 -- # avails["$mount"]=954408960 00:07:39.745 10:38:56 -- common/autotest_common.sh@353 -- # sizes["$mount"]=5284429824 00:07:39.745 10:38:56 -- common/autotest_common.sh@354 -- # uses["$mount"]=4330020864 00:07:39.745 10:38:56 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:39.745 10:38:56 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_root 00:07:39.745 10:38:56 -- common/autotest_common.sh@352 -- # fss["$mount"]=overlay 00:07:39.745 10:38:56 -- common/autotest_common.sh@353 -- # avails["$mount"]=52791177216 00:07:39.745 10:38:56 -- common/autotest_common.sh@353 -- # sizes["$mount"]=61742317568 00:07:39.745 10:38:56 -- common/autotest_common.sh@354 -- # uses["$mount"]=8951140352 00:07:39.745 10:38:56 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:39.745 10:38:56 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:39.745 10:38:56 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:39.745 10:38:56 -- common/autotest_common.sh@353 -- # avails["$mount"]=30868566016 00:07:39.745 10:38:56 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784 00:07:39.745 10:38:56 -- common/autotest_common.sh@354 -- # uses["$mount"]=2592768 00:07:39.745 10:38:56 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:39.745 10:38:56 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:39.745 10:38:56 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:39.745 10:38:56 -- common/autotest_common.sh@353 -- # avails["$mount"]=12342484992 00:07:39.745 10:38:56 -- common/autotest_common.sh@353 -- # sizes["$mount"]=12348465152 00:07:39.745 10:38:56 -- common/autotest_common.sh@354 -- # uses["$mount"]=5980160 00:07:39.745 10:38:56 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:39.745 10:38:56 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:39.745 10:38:56 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:39.745 10:38:56 -- common/autotest_common.sh@353 -- # avails["$mount"]=30870585344 00:07:39.745 10:38:56 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784 00:07:39.745 10:38:56 -- 
common/autotest_common.sh@354 -- # uses["$mount"]=573440 00:07:39.745 10:38:56 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:39.745 10:38:56 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:07:39.745 10:38:56 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:07:39.745 10:38:56 -- common/autotest_common.sh@353 -- # avails["$mount"]=6174224384 00:07:39.745 10:38:56 -- common/autotest_common.sh@353 -- # sizes["$mount"]=6174228480 00:07:39.745 10:38:56 -- common/autotest_common.sh@354 -- # uses["$mount"]=4096 00:07:39.745 10:38:56 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:07:39.745 10:38:56 -- common/autotest_common.sh@357 -- # printf '* Looking for test storage...\n' 00:07:39.745 * Looking for test storage... 00:07:39.745 10:38:56 -- common/autotest_common.sh@359 -- # local target_space new_size 00:07:39.745 10:38:56 -- common/autotest_common.sh@360 -- # for target_dir in "${storage_candidates[@]}" 00:07:39.745 10:38:56 -- common/autotest_common.sh@363 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:39.745 10:38:56 -- common/autotest_common.sh@363 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:40.005 10:38:56 -- common/autotest_common.sh@363 -- # mount=/ 00:07:40.005 10:38:56 -- common/autotest_common.sh@365 -- # target_space=52791177216 00:07:40.006 10:38:56 -- common/autotest_common.sh@366 -- # (( target_space == 0 || target_space < requested_size )) 00:07:40.006 10:38:56 -- common/autotest_common.sh@369 -- # (( target_space >= requested_size )) 00:07:40.006 10:38:56 -- common/autotest_common.sh@371 -- # [[ overlay == tmpfs ]] 00:07:40.006 10:38:56 -- common/autotest_common.sh@371 -- # [[ overlay == ramfs ]] 00:07:40.006 10:38:56 -- common/autotest_common.sh@371 -- # [[ / == / ]] 00:07:40.006 10:38:56 -- common/autotest_common.sh@372 -- # new_size=11165732864 00:07:40.006 10:38:56 -- common/autotest_common.sh@373 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:40.006 10:38:56 -- common/autotest_common.sh@378 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:40.006 10:38:56 -- common/autotest_common.sh@378 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:40.006 10:38:56 -- common/autotest_common.sh@379 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:40.006 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:40.006 10:38:56 -- common/autotest_common.sh@380 -- # return 0 00:07:40.006 10:38:56 -- common/autotest_common.sh@1667 -- # set -o errtrace 00:07:40.006 10:38:56 -- common/autotest_common.sh@1668 -- # shopt -s extdebug 00:07:40.006 10:38:56 -- common/autotest_common.sh@1669 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:40.006 10:38:56 -- common/autotest_common.sh@1671 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:40.006 10:38:56 -- common/autotest_common.sh@1672 -- # true 00:07:40.006 10:38:56 -- common/autotest_common.sh@1674 -- # xtrace_fd 00:07:40.006 10:38:56 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:40.006 10:38:56 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:40.006 10:38:56 -- common/autotest_common.sh@27 -- # exec 00:07:40.006 10:38:56 -- common/autotest_common.sh@29 -- # exec 00:07:40.006 10:38:56 -- common/autotest_common.sh@31 -- # 
xtrace_restore 00:07:40.006 10:38:56 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:07:40.006 10:38:56 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:40.006 10:38:56 -- common/autotest_common.sh@18 -- # set -x 00:07:40.006 10:38:56 -- nvmf/run.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:40.006 10:38:56 -- ../common.sh@8 -- # pids=() 00:07:40.006 10:38:56 -- nvmf/run.sh@55 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:40.006 10:38:56 -- nvmf/run.sh@56 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:40.006 10:38:56 -- nvmf/run.sh@56 -- # fuzz_num=25 00:07:40.006 10:38:56 -- nvmf/run.sh@57 -- # (( fuzz_num != 0 )) 00:07:40.006 10:38:56 -- nvmf/run.sh@59 -- # trap 'cleanup /tmp/llvm_fuzz*; exit 1' SIGINT SIGTERM EXIT 00:07:40.006 10:38:56 -- nvmf/run.sh@61 -- # mem_size=512 00:07:40.006 10:38:56 -- nvmf/run.sh@62 -- # [[ 1 -eq 1 ]] 00:07:40.006 10:38:56 -- nvmf/run.sh@63 -- # start_llvm_fuzz_short 25 1 00:07:40.006 10:38:56 -- ../common.sh@69 -- # local fuzz_num=25 00:07:40.006 10:38:56 -- ../common.sh@70 -- # local time=1 00:07:40.006 10:38:56 -- ../common.sh@72 -- # (( i = 0 )) 00:07:40.006 10:38:56 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:40.006 10:38:56 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:40.006 10:38:56 -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:40.006 10:38:56 -- nvmf/run.sh@24 -- # local timen=1 00:07:40.006 10:38:56 -- nvmf/run.sh@25 -- # local core=0x1 00:07:40.006 10:38:56 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:40.006 10:38:56 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:40.006 10:38:56 -- nvmf/run.sh@29 -- # printf %02d 0 00:07:40.006 10:38:56 -- nvmf/run.sh@29 -- # port=4400 00:07:40.006 10:38:56 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:40.006 10:38:56 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:40.006 10:38:56 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:40.006 10:38:56 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 -r /var/tmp/spdk0.sock 00:07:40.006 [2024-07-13 10:38:56.178541] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
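[note] Condensed, the set_test_storage trace above (the df parsing through "Found test storage") amounts to the following; this is a simplified sketch using the variable names from the trace, not a verbatim excerpt:

  requested_size=2147483648                    # 2 GiB; the trace adds 64 MiB slack
  declare -A mounts fss sizes avails uses
  while read -r source fs size use avail _ mount; do
      mounts["$mount"]=$source; fss["$mount"]=$fs
      sizes["$mount"]=$((size * 1024))         # df -T reports 1K blocks
      avails["$mount"]=$((avail * 1024))
      uses["$mount"]=$((use * 1024))
  done < <(df -T | grep -v Filesystem)
  # The first candidate dir whose filesystem has enough free space wins; here
  # the / overlay (~52 GB available) does, so SPDK_TEST_STORAGE is exported
  # as .../spdk/test/fuzz/llvm/nvmf.
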
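[note] The run.sh lines that follow (port derivation, config rewrite, launch) boil down to this pattern; paths are abbreviated, and the -P output dir, -r RPC socket, and trap/cleanup handling from the actual invocation are omitted for brevity:

  fuzzer_type=0 timen=1 core=0x1
  port=44$(printf %02d "$fuzzer_type")         # fuzzer 0 -> TCP port 4400
  trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
  # Rewrite the default NVMe/TCP service id in the JSON config for this fuzzer.
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      "$testdir/fuzz_json.conf" > "/tmp/fuzz_json_${fuzzer_type}.conf"
  "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
      -F "$trid" -c "/tmp/fuzz_json_${fuzzer_type}.conf" -t "$timen" \
      -D "$rootdir/../corpus/llvm_nvmf_${fuzzer_type}" -Z "$fuzzer_type"
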
00:07:40.006 [2024-07-13 10:38:56.178596] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1988888 ] 00:07:40.006 EAL: No free 2048 kB hugepages reported on node 1 00:07:40.006 [2024-07-13 10:38:56.353116] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:40.006 [2024-07-13 10:38:56.372428] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:40.006 [2024-07-13 10:38:56.372554] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:40.265 [2024-07-13 10:38:56.424318] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:40.265 [2024-07-13 10:38:56.440631] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:40.265 INFO: Running with entropic power schedule (0xFF, 100). 00:07:40.265 INFO: Seed: 2764218629 00:07:40.265 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:07:40.265 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:07:40.265 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:40.265 INFO: A corpus is not provided, starting from an empty corpus 00:07:40.265 #2 INITED exec/s: 0 rss: 60Mb 00:07:40.265 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:40.265 This may also happen if the target rejected all inputs we tried so far 00:07:40.265 [2024-07-13 10:38:56.485819] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.265 [2024-07-13 10:38:56.485848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.525 NEW_FUNC[1/670]: 0x49e700 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:40.525 NEW_FUNC[2/670]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:40.525 #5 NEW cov: 11472 ft: 11472 corp: 2/112b lim: 320 exec/s: 0 rss: 68Mb L: 111/111 MS: 3 ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:07:40.525 [2024-07-13 10:38:56.806764] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.525 [2024-07-13 10:38:56.806797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.525 [2024-07-13 10:38:56.806848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:40.525 [2024-07-13 10:38:56.806862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.525 [2024-07-13 10:38:56.806912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:40.525 [2024-07-13 10:38:56.806925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.525 #6 NEW cov: 11608 ft: 12072 corp: 3/337b lim: 320 exec/s: 0 rss: 68Mb L: 
225/225 MS: 1 InsertRepeatedBytes- 00:07:40.525 [2024-07-13 10:38:56.846922] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:1d1d1d1d SGL TRANSPORT DATA BLOCK TRANSPORT 0x1d1d1d1d1d1d1d1d 00:07:40.525 [2024-07-13 10:38:56.846950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.526 [2024-07-13 10:38:56.847003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:40.526 [2024-07-13 10:38:56.847018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.526 [2024-07-13 10:38:56.847069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:40.526 [2024-07-13 10:38:56.847082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.526 [2024-07-13 10:38:56.847133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:40.526 [2024-07-13 10:38:56.847146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.526 #7 NEW cov: 11614 ft: 12478 corp: 4/604b lim: 320 exec/s: 0 rss: 68Mb L: 267/267 MS: 1 InsertRepeatedBytes- 00:07:40.526 [2024-07-13 10:38:56.886692] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.526 [2024-07-13 10:38:56.886717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.526 #8 NEW cov: 11699 ft: 12857 corp: 5/715b lim: 320 exec/s: 0 rss: 68Mb L: 111/267 MS: 1 ShuffleBytes- 00:07:40.786 [2024-07-13 10:38:56.926787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (d3) qid:0 cid:4 nsid:d3d3b5d3 cdw10:d3d3d3d3 cdw11:d3d3d3d3 SGL TRANSPORT DATA BLOCK TRANSPORT 0xd3d3d3d3d3d3d3d3 00:07:40.786 [2024-07-13 10:38:56.926813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.786 NEW_FUNC[1/1]: 0x12fe060 in nvmf_tcp_req_set_cpl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:2014 00:07:40.786 #13 NEW cov: 11730 ft: 13004 corp: 6/799b lim: 320 exec/s: 0 rss: 68Mb L: 84/267 MS: 5 InsertRepeatedBytes-ChangeBinInt-InsertByte-ShuffleBytes-CopyPart- 00:07:40.786 [2024-07-13 10:38:56.966941] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.786 [2024-07-13 10:38:56.966966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.786 #14 NEW cov: 11730 ft: 13087 corp: 7/910b lim: 320 exec/s: 0 rss: 68Mb L: 111/267 MS: 1 ChangeBit- 00:07:40.786 [2024-07-13 10:38:57.007067] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.786 [2024-07-13 10:38:57.007093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.786 #15 NEW cov: 11730 ft: 13179 corp: 8/1021b lim: 320 exec/s: 0 
rss: 69Mb L: 111/267 MS: 1 ChangeByte- 00:07:40.786 [2024-07-13 10:38:57.047190] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.786 [2024-07-13 10:38:57.047216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.786 #16 NEW cov: 11730 ft: 13217 corp: 9/1128b lim: 320 exec/s: 0 rss: 69Mb L: 107/267 MS: 1 CrossOver- 00:07:40.786 [2024-07-13 10:38:57.077224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:40.786 [2024-07-13 10:38:57.077249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.786 #17 NEW cov: 11730 ft: 13318 corp: 10/1214b lim: 320 exec/s: 0 rss: 69Mb L: 86/267 MS: 1 CrossOver- 00:07:40.786 [2024-07-13 10:38:57.107702] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.786 [2024-07-13 10:38:57.107728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.786 [2024-07-13 10:38:57.107780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:40.786 [2024-07-13 10:38:57.107794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.786 [2024-07-13 10:38:57.107844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:40.786 [2024-07-13 10:38:57.107857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.786 [2024-07-13 10:38:57.107908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:40.786 [2024-07-13 10:38:57.107921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.786 #18 NEW cov: 11730 ft: 13424 corp: 11/1483b lim: 320 exec/s: 0 rss: 69Mb L: 269/269 MS: 1 CopyPart- 00:07:40.786 [2024-07-13 10:38:57.147528] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.786 [2024-07-13 10:38:57.147553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.786 #19 NEW cov: 11730 ft: 13448 corp: 12/1570b lim: 320 exec/s: 0 rss: 69Mb L: 87/269 MS: 1 EraseBytes- 00:07:41.046 [2024-07-13 10:38:57.177530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (d3) qid:0 cid:4 nsid:d3d3b5d3 cdw10:d3d3d3d3 cdw11:d3d3d3d3 SGL TRANSPORT DATA BLOCK TRANSPORT 0xd3d3d3d3d3d3d3d3 00:07:41.046 [2024-07-13 10:38:57.177557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.046 #20 NEW cov: 11730 ft: 13463 corp: 13/1654b lim: 320 exec/s: 0 rss: 69Mb L: 84/269 MS: 1 CopyPart- 00:07:41.046 [2024-07-13 10:38:57.217715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2d) qid:0 cid:4 nsid:d3b5d3d3 cdw10:d3d3d3d3 cdw11:d3d3d3d3 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.046 [2024-07-13 10:38:57.217741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.046 NEW_FUNC[1/1]: 0x16f8690 in nvme_get_sgl_unkeyed /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:143 00:07:41.046 #25 NEW cov: 11744 ft: 13790 corp: 14/1730b lim: 320 exec/s: 0 rss: 69Mb L: 76/269 MS: 5 ChangeByte-ChangeByte-InsertByte-ChangeBit-CrossOver- 00:07:41.046 [2024-07-13 10:38:57.257814] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00210000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.046 [2024-07-13 10:38:57.257841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.046 #26 NEW cov: 11744 ft: 13873 corp: 15/1842b lim: 320 exec/s: 0 rss: 69Mb L: 112/269 MS: 1 InsertByte- 00:07:41.046 [2024-07-13 10:38:57.297953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.046 [2024-07-13 10:38:57.297978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.046 #27 NEW cov: 11744 ft: 13893 corp: 16/1928b lim: 320 exec/s: 0 rss: 69Mb L: 86/269 MS: 1 ShuffleBytes- 00:07:41.046 [2024-07-13 10:38:57.338064] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00210000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.046 [2024-07-13 10:38:57.338090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.046 #28 NEW cov: 11744 ft: 13936 corp: 17/2041b lim: 320 exec/s: 0 rss: 69Mb L: 113/269 MS: 1 InsertByte- 00:07:41.046 [2024-07-13 10:38:57.378262] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00210000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.046 [2024-07-13 10:38:57.378287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.046 [2024-07-13 10:38:57.378338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.046 [2024-07-13 10:38:57.378352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.046 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:41.046 #29 NEW cov: 11767 ft: 14084 corp: 18/2215b lim: 320 exec/s: 0 rss: 69Mb L: 174/269 MS: 1 CrossOver- 00:07:41.046 [2024-07-13 10:38:57.418302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2d) qid:0 cid:4 nsid:b5d3d3d3 cdw10:d3d3d3d3 cdw11:d3d3d3d3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.046 [2024-07-13 10:38:57.418327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.306 #30 NEW cov: 11767 ft: 14185 corp: 19/2291b lim: 320 exec/s: 0 rss: 69Mb L: 76/269 MS: 1 ShuffleBytes- 00:07:41.306 [2024-07-13 10:38:57.458775] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.306 [2024-07-13 10:38:57.458800] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.306 [2024-07-13 10:38:57.458849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.306 [2024-07-13 10:38:57.458863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.306 [2024-07-13 10:38:57.458910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.306 [2024-07-13 10:38:57.458924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.306 [2024-07-13 10:38:57.458971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.307 [2024-07-13 10:38:57.458984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.307 #31 NEW cov: 11767 ft: 14201 corp: 20/2560b lim: 320 exec/s: 31 rss: 70Mb L: 269/269 MS: 1 ChangeBit- 00:07:41.307 [2024-07-13 10:38:57.498544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.307 [2024-07-13 10:38:57.498569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.307 #32 NEW cov: 11767 ft: 14222 corp: 21/2647b lim: 320 exec/s: 32 rss: 70Mb L: 87/269 MS: 1 InsertByte- 00:07:41.307 [2024-07-13 10:38:57.528628] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.307 [2024-07-13 10:38:57.528652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.307 #33 NEW cov: 11767 ft: 14275 corp: 22/2754b lim: 320 exec/s: 33 rss: 70Mb L: 107/269 MS: 1 ChangeBit- 00:07:41.307 [2024-07-13 10:38:57.568752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.307 [2024-07-13 10:38:57.568776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.307 #34 NEW cov: 11767 ft: 14308 corp: 23/2840b lim: 320 exec/s: 34 rss: 70Mb L: 86/269 MS: 1 ShuffleBytes- 00:07:41.307 [2024-07-13 10:38:57.608985] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00210000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.307 [2024-07-13 10:38:57.609012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.307 [2024-07-13 10:38:57.609062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.307 [2024-07-13 10:38:57.609076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.307 #35 NEW cov: 11767 ft: 14330 corp: 24/3014b lim: 320 exec/s: 35 rss: 70Mb L: 174/269 MS: 1 ChangeBinInt- 00:07:41.307 [2024-07-13 10:38:57.648937] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00210000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.307 
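[note] Each "#N NEW cov:" event past this point is libFuzzer adding one unit to the (initially empty) corpus; the MS: field names the mutation chain that produced it. To tally progress from a saved copy of this run (a throwaway sketch; nvmf_fuzz_0.log is a hypothetical file name):

  grep -oE '#[0-9]+ NEW ' nvmf_fuzz_0.log | wc -l      # corpus units added
  grep -oE 'cov: [0-9]+' nvmf_fuzz_0.log | tail -n1    # most recent edge-coverage count
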
[2024-07-13 10:38:57.648962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.307 #36 NEW cov: 11767 ft: 14340 corp: 25/3127b lim: 320 exec/s: 36 rss: 70Mb L: 113/269 MS: 1 ChangeBit- 00:07:41.307 [2024-07-13 10:38:57.689087] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.307 [2024-07-13 10:38:57.689111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.566 #37 NEW cov: 11767 ft: 14351 corp: 26/3214b lim: 320 exec/s: 37 rss: 70Mb L: 87/269 MS: 1 ChangeByte- 00:07:41.566 [2024-07-13 10:38:57.729187] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x60000000000 00:07:41.566 [2024-07-13 10:38:57.729212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.566 #38 NEW cov: 11767 ft: 14410 corp: 27/3333b lim: 320 exec/s: 38 rss: 70Mb L: 119/269 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\006"- 00:07:41.566 [2024-07-13 10:38:57.759309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (d3) qid:0 cid:4 nsid:d3d3b5d3 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.566 [2024-07-13 10:38:57.759334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.566 [2024-07-13 10:38:57.759384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.566 [2024-07-13 10:38:57.759397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.566 #39 NEW cov: 11767 ft: 14460 corp: 28/3514b lim: 320 exec/s: 39 rss: 70Mb L: 181/269 MS: 1 InsertRepeatedBytes- 00:07:41.566 [2024-07-13 10:38:57.799539] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.566 [2024-07-13 10:38:57.799564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.566 [2024-07-13 10:38:57.799614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.566 [2024-07-13 10:38:57.799628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.566 #40 NEW cov: 11767 ft: 14489 corp: 29/3661b lim: 320 exec/s: 40 rss: 70Mb L: 147/269 MS: 1 CopyPart- 00:07:41.566 [2024-07-13 10:38:57.839603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (d3) qid:0 cid:4 nsid:d3d3b5d3 cdw10:00000000 cdw11:40000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.566 [2024-07-13 10:38:57.839627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.566 [2024-07-13 10:38:57.839677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.566 [2024-07-13 10:38:57.839690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:07:41.566 #41 NEW cov: 11767 ft: 14502 corp: 30/3842b lim: 320 exec/s: 41 rss: 70Mb L: 181/269 MS: 1 ChangeByte- 00:07:41.567 [2024-07-13 10:38:57.879741] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00210000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.567 [2024-07-13 10:38:57.879765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.567 [2024-07-13 10:38:57.879816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.567 [2024-07-13 10:38:57.879830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.567 #42 NEW cov: 11767 ft: 14504 corp: 31/3984b lim: 320 exec/s: 42 rss: 70Mb L: 142/269 MS: 1 CopyPart- 00:07:41.567 [2024-07-13 10:38:57.909694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.567 [2024-07-13 10:38:57.909718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.567 #43 NEW cov: 11767 ft: 14505 corp: 32/4071b lim: 320 exec/s: 43 rss: 70Mb L: 87/269 MS: 1 ChangeByte- 00:07:41.567 [2024-07-13 10:38:57.949787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (d3) qid:0 cid:4 nsid:d3d3b5d3 cdw10:2c2c33d3 cdw11:2c2c2c2c SGL TRANSPORT DATA BLOCK TRANSPORT 0xd3d3d3d3d3d3d3d3 00:07:41.567 [2024-07-13 10:38:57.949811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.826 #44 NEW cov: 11767 ft: 14506 corp: 33/4155b lim: 320 exec/s: 44 rss: 70Mb L: 84/269 MS: 1 ChangeBinInt- 00:07:41.826 [2024-07-13 10:38:57.990236] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:1d1d1d1d SGL TRANSPORT DATA BLOCK TRANSPORT 0x1d1d1d1d1d1d1d1d 00:07:41.826 [2024-07-13 10:38:57.990260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.826 [2024-07-13 10:38:57.990304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.826 [2024-07-13 10:38:57.990318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.826 [2024-07-13 10:38:57.990368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.826 [2024-07-13 10:38:57.990381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.826 [2024-07-13 10:38:57.990430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.826 [2024-07-13 10:38:57.990446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.826 #45 NEW cov: 11767 ft: 14579 corp: 34/4422b lim: 320 exec/s: 45 rss: 70Mb L: 267/269 MS: 1 CopyPart- 00:07:41.826 [2024-07-13 10:38:58.030084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2d) qid:0 cid:4 nsid:b5d3d3d3 cdw10:d3d3d3d3 cdw11:d3d3d3d3 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:41.826 [2024-07-13 10:38:58.030109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.826 #46 NEW cov: 11767 ft: 14592 corp: 35/4498b lim: 320 exec/s: 46 rss: 70Mb L: 76/269 MS: 1 ShuffleBytes- 00:07:41.826 [2024-07-13 10:38:58.070504] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.826 [2024-07-13 10:38:58.070528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.826 [2024-07-13 10:38:58.070582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.826 [2024-07-13 10:38:58.070595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.826 [2024-07-13 10:38:58.070649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.826 [2024-07-13 10:38:58.070662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.826 [2024-07-13 10:38:58.070711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.827 [2024-07-13 10:38:58.070724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.827 #47 NEW cov: 11767 ft: 14599 corp: 36/4767b lim: 320 exec/s: 47 rss: 70Mb L: 269/269 MS: 1 ShuffleBytes- 00:07:41.827 [2024-07-13 10:38:58.110591] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.827 [2024-07-13 10:38:58.110615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.827 [2024-07-13 10:38:58.110666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.827 [2024-07-13 10:38:58.110679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.827 [2024-07-13 10:38:58.110727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.827 [2024-07-13 10:38:58.110740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.827 [2024-07-13 10:38:58.110789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.827 [2024-07-13 10:38:58.110802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.827 #48 NEW cov: 11767 ft: 14626 corp: 37/5037b lim: 320 exec/s: 48 rss: 70Mb L: 270/270 MS: 1 InsertByte- 00:07:41.827 [2024-07-13 10:38:58.150420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2d) qid:0 cid:4 nsid:b5d3d3d3 cdw10:d3d3d3d3 cdw11:d3d3d3d3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.827 [2024-07-13 10:38:58.150451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.827 #49 NEW cov: 11767 ft: 14635 corp: 38/5113b lim: 320 exec/s: 49 rss: 70Mb L: 76/270 MS: 1 ChangeASCIIInt- 00:07:41.827 [2024-07-13 10:38:58.190813] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:41.827 [2024-07-13 10:38:58.190839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.827 [2024-07-13 10:38:58.190908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.827 [2024-07-13 10:38:58.190921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.827 [2024-07-13 10:38:58.190972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000600 00:07:41.827 [2024-07-13 10:38:58.190985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.827 [2024-07-13 10:38:58.191034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.827 [2024-07-13 10:38:58.191048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.827 #50 NEW cov: 11767 ft: 14667 corp: 39/5382b lim: 320 exec/s: 50 rss: 70Mb L: 269/270 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\006"- 00:07:42.086 [2024-07-13 10:38:58.230661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2d) qid:0 cid:4 nsid:d3b5d3d3 cdw10:d3d3d3d3 cdw11:d3d3d3d3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.086 [2024-07-13 10:38:58.230689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.086 #51 NEW cov: 11767 ft: 14671 corp: 40/5458b lim: 320 exec/s: 51 rss: 70Mb L: 76/270 MS: 1 ChangeASCIIInt- 00:07:42.086 [2024-07-13 10:38:58.270757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:42.086 [2024-07-13 10:38:58.270782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.086 #52 NEW cov: 11767 ft: 14699 corp: 41/5525b lim: 320 exec/s: 52 rss: 70Mb L: 67/270 MS: 1 CrossOver- 00:07:42.086 [2024-07-13 10:38:58.311065] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:dfdfdfdf SGL TRANSPORT DATA BLOCK TRANSPORT 0xdfdfdfdfdfdfdfdf 00:07:42.086 [2024-07-13 10:38:58.311089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.087 [2024-07-13 10:38:58.311151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (df) qid:0 cid:5 nsid:dfdfdfdf cdw10:dfdfdfdf cdw11:000000df SGL TRANSPORT DATA BLOCK TRANSPORT 0xdfdfdfdfdfdfdfdf 00:07:42.087 [2024-07-13 10:38:58.311165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.087 [2024-07-13 10:38:58.311215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 
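For reference, the "#NN NEW" lines above are libFuzzer's standard status output: cov counts covered code edges, ft counts features (finer-grained coverage signals), corp gives corpus entries/bytes, lim is the current input-length cap, exec/s the execution rate, rss resident memory, L the new input's length versus the longest in the corpus, and MS the mutation sequence that produced the input. A minimal awk sketch for pulling the headline numbers out of a saved run log (fuzz_run.log is a placeholder name):

    awk '/ NEW / {
      for (i = 1; i <= NF; i++) {
        if ($i == "cov:")    cov = $(i + 1)   # covered edges so far
        if ($i == "ft:")     ft  = $(i + 1)   # libFuzzer features
        if ($i == "exec/s:") eps = $(i + 1)   # executions per second
      }
    }
    END { printf "last NEW: cov=%s ft=%s exec/s=%s\n", cov, ft, eps }' fuzz_run.log

Scanning token-by-token keeps the sketch robust to the elapsed-time prefixes on every line of this log.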
00:07:42.087 [2024-07-13 10:38:58.311229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.087 #53 NEW cov: 11767 ft: 14725 corp: 42/5742b lim: 320 exec/s: 53 rss: 70Mb L: 217/270 MS: 1 InsertRepeatedBytes- 00:07:42.087 [2024-07-13 10:38:58.350979] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.087 [2024-07-13 10:38:58.351005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.087 #54 NEW cov: 11767 ft: 14738 corp: 43/5854b lim: 320 exec/s: 54 rss: 70Mb L: 112/270 MS: 1 InsertByte- 00:07:42.087 [2024-07-13 10:38:58.381082] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.087 [2024-07-13 10:38:58.381108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.087 #55 NEW cov: 11767 ft: 14752 corp: 44/5941b lim: 320 exec/s: 55 rss: 70Mb L: 87/270 MS: 1 ChangeBit- 00:07:42.087 [2024-07-13 10:38:58.421142] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.087 [2024-07-13 10:38:58.421167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.087 #56 NEW cov: 11767 ft: 14757 corp: 45/6006b lim: 320 exec/s: 56 rss: 71Mb L: 65/270 MS: 1 CrossOver- 00:07:42.087 [2024-07-13 10:38:58.461238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:005e0000 cdw11:00000000 00:07:42.087 [2024-07-13 10:38:58.461264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.346 #57 NEW cov: 11767 ft: 14784 corp: 46/6081b lim: 320 exec/s: 28 rss: 71Mb L: 75/270 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\006"- 00:07:42.346 #57 DONE cov: 11767 ft: 14784 corp: 46/6081b lim: 320 exec/s: 28 rss: 71Mb 00:07:42.346 ###### Recommended dictionary. ###### 00:07:42.347 "\000\000\000\000\000\000\000\006" # Uses: 2 00:07:42.347 ###### End of recommended dictionary. 
###### 00:07:42.347 Done 57 runs in 2 second(s) 00:07:42.347 10:38:58 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_0.conf 00:07:42.347 10:38:58 -- ../common.sh@72 -- # (( i++ )) 00:07:42.347 10:38:58 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:42.347 10:38:58 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:42.347 10:38:58 -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:07:42.347 10:38:58 -- nvmf/run.sh@24 -- # local timen=1 00:07:42.347 10:38:58 -- nvmf/run.sh@25 -- # local core=0x1 00:07:42.347 10:38:58 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:42.347 10:38:58 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:07:42.347 10:38:58 -- nvmf/run.sh@29 -- # printf %02d 1 00:07:42.347 10:38:58 -- nvmf/run.sh@29 -- # port=4401 00:07:42.347 10:38:58 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:42.347 10:38:58 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:07:42.347 10:38:58 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:42.347 10:38:58 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 -r /var/tmp/spdk1.sock 00:07:42.347 [2024-07-13 10:38:58.628173] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:42.347 [2024-07-13 10:38:58.628226] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1989298 ] 00:07:42.347 EAL: No free 2048 kB hugepages reported on node 1 00:07:42.606 [2024-07-13 10:38:58.809505] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.606 [2024-07-13 10:38:58.829449] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:42.606 [2024-07-13 10:38:58.829590] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.606 [2024-07-13 10:38:58.881070] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:42.606 [2024-07-13 10:38:58.897388] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:07:42.606 INFO: Running with entropic power schedule (0xFF, 100). 00:07:42.606 INFO: Seed: 924254327 00:07:42.606 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:07:42.606 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:07:42.606 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:42.606 INFO: A corpus is not provided, starting from an empty corpus 00:07:42.606 #2 INITED exec/s: 0 rss: 60Mb 00:07:42.606 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:42.606 This may also happen if the target rejected all inputs we tried so far 00:07:42.606 [2024-07-13 10:38:58.945715] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45060) > buf size (4096) 00:07:42.606 [2024-07-13 10:38:58.945944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.606 [2024-07-13 10:38:58.945973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.866 NEW_FUNC[1/670]: 0x49f000 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:07:42.866 NEW_FUNC[2/670]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:42.866 #6 NEW cov: 11565 ft: 11571 corp: 2/10b lim: 30 exec/s: 0 rss: 69Mb L: 9/9 MS: 4 ChangeBit-CrossOver-EraseBytes-CMP- DE: ",\000\000\000\000\000\000\000"- 00:07:42.866 [2024-07-13 10:38:59.246562] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45060) > buf size (4096) 00:07:42.866 [2024-07-13 10:38:59.246692] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:42.866 [2024-07-13 10:38:59.246806] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:42.866 [2024-07-13 10:38:59.246919] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:42.866 [2024-07-13 10:38:59.247023] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000004a 00:07:42.866 [2024-07-13 10:38:59.247240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.866 [2024-07-13 10:38:59.247272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.866 [2024-07-13 10:38:59.247329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.866 [2024-07-13 10:38:59.247342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.866 [2024-07-13 10:38:59.247396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.866 [2024-07-13 10:38:59.247410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.866 [2024-07-13 10:38:59.247464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.867 [2024-07-13 10:38:59.247478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.867 [2024-07-13 10:38:59.247531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:42.867 [2024-07-13 10:38:59.247544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 
sqhd:0013 p:0 m:0 dnr:0 00:07:43.126 NEW_FUNC[1/1]: 0x1c54e00 in accel_comp_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/accel/accel_sw.c:554 00:07:43.126 #7 NEW cov: 11689 ft: 12576 corp: 3/40b lim: 30 exec/s: 0 rss: 69Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:07:43.126 [2024-07-13 10:38:59.296504] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45060) > buf size (4096) 00:07:43.126 [2024-07-13 10:38:59.296712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.126 [2024-07-13 10:38:59.296739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.126 #8 NEW cov: 11695 ft: 12822 corp: 4/49b lim: 30 exec/s: 0 rss: 69Mb L: 9/30 MS: 1 ChangeByte- 00:07:43.126 [2024-07-13 10:38:59.336557] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45060) > buf size (4096) 00:07:43.126 [2024-07-13 10:38:59.336767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.126 [2024-07-13 10:38:59.336792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.126 #9 NEW cov: 11780 ft: 13228 corp: 5/58b lim: 30 exec/s: 0 rss: 69Mb L: 9/30 MS: 1 ChangeBit- 00:07:43.126 [2024-07-13 10:38:59.376713] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45636) > buf size (4096) 00:07:43.126 [2024-07-13 10:38:59.376924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c900000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.127 [2024-07-13 10:38:59.376949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.127 #10 NEW cov: 11780 ft: 13418 corp: 6/67b lim: 30 exec/s: 0 rss: 69Mb L: 9/30 MS: 1 ShuffleBytes- 00:07:43.127 [2024-07-13 10:38:59.416778] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (110596) > buf size (4096) 00:07:43.127 [2024-07-13 10:38:59.416988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:6c000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.127 [2024-07-13 10:38:59.417018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.127 #11 NEW cov: 11780 ft: 13507 corp: 7/76b lim: 30 exec/s: 0 rss: 69Mb L: 9/30 MS: 1 ChangeBit- 00:07:43.127 [2024-07-13 10:38:59.456951] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (110888) > buf size (4096) 00:07:43.127 [2024-07-13 10:38:59.457162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:6c490000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.127 [2024-07-13 10:38:59.457187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.127 #12 NEW cov: 11780 ft: 13623 corp: 8/85b lim: 30 exec/s: 0 rss: 70Mb L: 9/30 MS: 1 CMP- DE: "I\000\000\000\000\000\000\000"- 00:07:43.127 [2024-07-13 10:38:59.497013] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45060) > buf size (4096) 00:07:43.127 [2024-07-13 10:38:59.497256] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.127 [2024-07-13 10:38:59.497282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.387 #13 NEW cov: 11780 ft: 13694 corp: 9/95b lim: 30 exec/s: 0 rss: 70Mb L: 10/30 MS: 1 InsertByte- 00:07:43.387 [2024-07-13 10:38:59.537244] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45060) > buf size (4096) 00:07:43.387 [2024-07-13 10:38:59.537363] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:43.387 [2024-07-13 10:38:59.537479] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (261300) > buf size (4096) 00:07:43.387 [2024-07-13 10:38:59.537584] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x4aff 00:07:43.387 [2024-07-13 10:38:59.537689] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000004a 00:07:43.387 [2024-07-13 10:38:59.537910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.387 [2024-07-13 10:38:59.537935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.387 [2024-07-13 10:38:59.537993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.387 [2024-07-13 10:38:59.538007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.387 [2024-07-13 10:38:59.538064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ff2c0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.387 [2024-07-13 10:38:59.538078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.387 [2024-07-13 10:38:59.538134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d0000024 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.387 [2024-07-13 10:38:59.538147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.387 [2024-07-13 10:38:59.538204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.387 [2024-07-13 10:38:59.538217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:43.387 #14 NEW cov: 11780 ft: 13825 corp: 10/125b lim: 30 exec/s: 0 rss: 70Mb L: 30/30 MS: 1 CrossOver- 00:07:43.387 [2024-07-13 10:38:59.577299] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45060) > buf size (4096) 00:07:43.387 [2024-07-13 10:38:59.577518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.387 [2024-07-13 10:38:59.577549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.387 #15 NEW 
cov: 11780 ft: 13849 corp: 11/134b lim: 30 exec/s: 0 rss: 70Mb L: 9/30 MS: 1 ChangeBinInt- 00:07:43.387 [2024-07-13 10:38:59.617404] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (46080) > buf size (4096) 00:07:43.387 [2024-07-13 10:38:59.617632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2cff00f7 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.387 [2024-07-13 10:38:59.617657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.387 #16 NEW cov: 11780 ft: 13962 corp: 12/143b lim: 30 exec/s: 0 rss: 70Mb L: 9/30 MS: 1 ChangeBinInt- 00:07:43.387 [2024-07-13 10:38:59.657561] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45060) > buf size (4096) 00:07:43.387 [2024-07-13 10:38:59.657775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.387 [2024-07-13 10:38:59.657800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.387 #17 NEW cov: 11780 ft: 13977 corp: 13/152b lim: 30 exec/s: 0 rss: 70Mb L: 9/30 MS: 1 ChangeByte- 00:07:43.387 [2024-07-13 10:38:59.687607] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (114692) > buf size (4096) 00:07:43.387 [2024-07-13 10:38:59.687825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:70000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.387 [2024-07-13 10:38:59.687850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.387 #18 NEW cov: 11780 ft: 14015 corp: 14/161b lim: 30 exec/s: 0 rss: 70Mb L: 9/30 MS: 1 ChangeBinInt- 00:07:43.387 [2024-07-13 10:38:59.727722] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45060) > buf size (4096) 00:07:43.387 [2024-07-13 10:38:59.728048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.387 [2024-07-13 10:38:59.728073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.387 [2024-07-13 10:38:59.728127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.387 [2024-07-13 10:38:59.728141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.387 #19 NEW cov: 11797 ft: 14366 corp: 15/178b lim: 30 exec/s: 0 rss: 70Mb L: 17/30 MS: 1 PersAutoDict- DE: "I\000\000\000\000\000\000\000"- 00:07:43.387 [2024-07-13 10:38:59.767845] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45060) > buf size (4096) 00:07:43.387 [2024-07-13 10:38:59.768080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.387 [2024-07-13 10:38:59.768104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.647 #20 NEW cov: 11797 ft: 14424 corp: 16/188b lim: 30 exec/s: 0 rss: 70Mb L: 
10/30 MS: 1 InsertByte- 00:07:43.647 [2024-07-13 10:38:59.807981] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45060) > buf size (4096) 00:07:43.647 [2024-07-13 10:38:59.808192] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000004a 00:07:43.647 [2024-07-13 10:38:59.808408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.647 [2024-07-13 10:38:59.808434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.647 [2024-07-13 10:38:59.808498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:002c0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.647 [2024-07-13 10:38:59.808512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.647 [2024-07-13 10:38:59.808567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000200 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.647 [2024-07-13 10:38:59.808580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.647 #21 NEW cov: 11797 ft: 14648 corp: 17/206b lim: 30 exec/s: 0 rss: 70Mb L: 18/30 MS: 1 CrossOver- 00:07:43.647 [2024-07-13 10:38:59.848057] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45060) > buf size (4096) 00:07:43.647 [2024-07-13 10:38:59.848272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.647 [2024-07-13 10:38:59.848297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.647 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:43.647 #22 NEW cov: 11820 ft: 14687 corp: 18/215b lim: 30 exec/s: 0 rss: 70Mb L: 9/30 MS: 1 ShuffleBytes- 00:07:43.647 [2024-07-13 10:38:59.888168] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45060) > buf size (4096) 00:07:43.647 [2024-07-13 10:38:59.888389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c00002c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.647 [2024-07-13 10:38:59.888415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.647 #28 NEW cov: 11820 ft: 14704 corp: 19/225b lim: 30 exec/s: 0 rss: 70Mb L: 10/30 MS: 1 PersAutoDict- DE: ",\000\000\000\000\000\000\000"- 00:07:43.647 [2024-07-13 10:38:59.928308] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (1047852) > buf size (4096) 00:07:43.647 [2024-07-13 10:38:59.928528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff4a8386 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.647 [2024-07-13 10:38:59.928553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.647 #32 NEW cov: 11820 ft: 14714 corp: 20/231b lim: 30 exec/s: 32 rss: 70Mb L: 6/30 MS: 4 EraseBytes-ShuffleBytes-ChangeByte-InsertByte- 
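The recurring ctrlr.c:2516 "Get log page: len (45060) > buf size (4096)" rejections are directly decodable from the command dwords printed above. Assuming the standard NVMe Get Log Page encoding — NUMDL in CDW10 bits 31:16, NUMDH in CDW11 bits 15:0, transfer length = (NUMD + 1) dwords — a quick shell check reproduces the rejected size:

    cdw10=0x2c000000; cdw11=0x00000000           # values from the trace above
    numdl=$(( (cdw10 >> 16) & 0xffff ))
    numdh=$((  cdw11        & 0xffff ))
    numd=$((  (numdh << 16) | numdl ))
    echo "len = $(( (numd + 1) * 4 )) bytes"     # prints: len = 45060 bytes

The same arithmetic covers the other sizes in this run, e.g. cdw10:2c900000 gives (0x2c90 + 1) * 4 = 45636, matching the "len (45636)" rejection above.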
00:07:43.647 [2024-07-13 10:38:59.968397] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45352) > buf size (4096) 00:07:43.647 [2024-07-13 10:38:59.968625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c490000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.647 [2024-07-13 10:38:59.968651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.647 #37 NEW cov: 11820 ft: 14727 corp: 21/240b lim: 30 exec/s: 37 rss: 70Mb L: 9/30 MS: 5 CrossOver-ShuffleBytes-CopyPart-EraseBytes-PersAutoDict- DE: "I\000\000\000\000\000\000\000"- 00:07:43.647 [2024-07-13 10:39:00.008562] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (114692) > buf size (4096) 00:07:43.647 [2024-07-13 10:39:00.008782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:70000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.647 [2024-07-13 10:39:00.008809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.907 #38 NEW cov: 11820 ft: 14741 corp: 22/249b lim: 30 exec/s: 38 rss: 70Mb L: 9/30 MS: 1 ChangeBinInt- 00:07:43.907 [2024-07-13 10:39:00.048719] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (261128) > buf size (4096) 00:07:43.907 [2024-07-13 10:39:00.048844] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x2000086f7 00:07:43.907 [2024-07-13 10:39:00.049048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff010004 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.907 [2024-07-13 10:39:00.049075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.907 [2024-07-13 10:39:00.049134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000200 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.907 [2024-07-13 10:39:00.049149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.907 #39 NEW cov: 11820 ft: 14760 corp: 23/263b lim: 30 exec/s: 39 rss: 70Mb L: 14/30 MS: 1 CMP- DE: "\001\004\000\000\000\000\000\000"- 00:07:43.907 [2024-07-13 10:39:00.088774] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (51780) > buf size (4096) 00:07:43.907 [2024-07-13 10:39:00.088990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32900000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.907 [2024-07-13 10:39:00.089015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.907 #40 NEW cov: 11820 ft: 14766 corp: 24/272b lim: 30 exec/s: 40 rss: 70Mb L: 9/30 MS: 1 ChangeByte- 00:07:43.907 [2024-07-13 10:39:00.128898] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (46080) > buf size (4096) 00:07:43.907 [2024-07-13 10:39:00.129113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2cff00f7 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.907 [2024-07-13 10:39:00.129139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.907 #41 NEW cov: 11820 ft: 14776 corp: 25/282b lim: 30 exec/s: 41 rss: 70Mb L: 10/30 MS: 1 InsertByte- 00:07:43.907 [2024-07-13 10:39:00.168970] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xfff7 00:07:43.907 [2024-07-13 10:39:00.169183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2cff00f7 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.907 [2024-07-13 10:39:00.169209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.907 #42 NEW cov: 11820 ft: 14800 corp: 26/292b lim: 30 exec/s: 42 rss: 70Mb L: 10/30 MS: 1 CopyPart- 00:07:43.907 [2024-07-13 10:39:00.209135] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (74756) > buf size (4096) 00:07:43.907 [2024-07-13 10:39:00.209354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:49000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.907 [2024-07-13 10:39:00.209380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.907 #43 NEW cov: 11820 ft: 14813 corp: 27/301b lim: 30 exec/s: 43 rss: 70Mb L: 9/30 MS: 1 PersAutoDict- DE: "I\000\000\000\000\000\000\000"- 00:07:43.907 [2024-07-13 10:39:00.249275] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (261420) > buf size (4096) 00:07:43.907 [2024-07-13 10:39:00.249508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff4a0086 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.907 [2024-07-13 10:39:00.249534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.907 #44 NEW cov: 11820 ft: 14820 corp: 28/307b lim: 30 exec/s: 44 rss: 70Mb L: 6/30 MS: 1 CopyPart- 00:07:43.907 [2024-07-13 10:39:00.289501] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45060) > buf size (4096) 00:07:43.907 [2024-07-13 10:39:00.289619] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (787456) > buf size (4096) 00:07:43.907 [2024-07-13 10:39:00.289729] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262144) > buf size (4096) 00:07:43.907 [2024-07-13 10:39:00.289841] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x4aff 00:07:43.907 [2024-07-13 10:39:00.289956] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000004a 00:07:43.907 [2024-07-13 10:39:00.290171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.907 [2024-07-13 10:39:00.290196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.907 [2024-07-13 10:39:00.290252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.907 [2024-07-13 10:39:00.290267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.907 [2024-07-13 10:39:00.290321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff0000 
cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.907 [2024-07-13 10:39:00.290335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.907 [2024-07-13 10:39:00.290389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d0000024 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.908 [2024-07-13 10:39:00.290402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.908 [2024-07-13 10:39:00.290459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:43.908 [2024-07-13 10:39:00.290472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:44.167 #45 NEW cov: 11820 ft: 14889 corp: 29/337b lim: 30 exec/s: 45 rss: 70Mb L: 30/30 MS: 1 ShuffleBytes- 00:07:44.167 [2024-07-13 10:39:00.329658] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45060) > buf size (4096) 00:07:44.167 [2024-07-13 10:39:00.329776] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:44.167 [2024-07-13 10:39:00.329901] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:44.167 [2024-07-13 10:39:00.330007] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:44.167 [2024-07-13 10:39:00.330115] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (1048576) > buf size (4096) 00:07:44.167 [2024-07-13 10:39:00.330333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.167 [2024-07-13 10:39:00.330359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.167 [2024-07-13 10:39:00.330417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.167 [2024-07-13 10:39:00.330432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.167 [2024-07-13 10:39:00.330491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.167 [2024-07-13 10:39:00.330506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.167 [2024-07-13 10:39:00.330560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.167 [2024-07-13 10:39:00.330574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.167 [2024-07-13 10:39:00.330632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.167 [2024-07-13 10:39:00.330645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 
sqhd:0013 p:0 m:0 dnr:0 00:07:44.167 #46 NEW cov: 11820 ft: 14896 corp: 30/367b lim: 30 exec/s: 46 rss: 70Mb L: 30/30 MS: 1 CrossOver- 00:07:44.167 [2024-07-13 10:39:00.369578] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45476) > buf size (4096) 00:07:44.167 [2024-07-13 10:39:00.369780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c6800f7 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.167 [2024-07-13 10:39:00.369804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.167 #47 NEW cov: 11820 ft: 14921 corp: 31/377b lim: 30 exec/s: 47 rss: 70Mb L: 10/30 MS: 1 ChangeByte- 00:07:44.167 [2024-07-13 10:39:00.409702] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45060) > buf size (4096) 00:07:44.167 [2024-07-13 10:39:00.409905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.167 [2024-07-13 10:39:00.409931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.167 #48 NEW cov: 11820 ft: 14923 corp: 32/387b lim: 30 exec/s: 48 rss: 70Mb L: 10/30 MS: 1 InsertByte- 00:07:44.167 [2024-07-13 10:39:00.439887] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45224) > buf size (4096) 00:07:44.167 [2024-07-13 10:39:00.440001] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (212996) > buf size (4096) 00:07:44.167 [2024-07-13 10:39:00.440105] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (18944) > len (4) 00:07:44.167 [2024-07-13 10:39:00.440333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c290000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.167 [2024-07-13 10:39:00.440359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.167 [2024-07-13 10:39:00.440418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d000002c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.168 [2024-07-13 10:39:00.440432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.168 [2024-07-13 10:39:00.440489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.168 [2024-07-13 10:39:00.440503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.168 #49 NEW cov: 11826 ft: 14933 corp: 33/406b lim: 30 exec/s: 49 rss: 70Mb L: 19/30 MS: 1 InsertByte- 00:07:44.168 [2024-07-13 10:39:00.479987] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (1047852) > buf size (4096) 00:07:44.168 [2024-07-13 10:39:00.480100] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (224) > len (4) 00:07:44.168 [2024-07-13 10:39:00.480300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff4a8386 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.168 [2024-07-13 10:39:00.480326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.168 [2024-07-13 10:39:00.480385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.168 [2024-07-13 10:39:00.480399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.168 #50 NEW cov: 11826 ft: 14953 corp: 34/422b lim: 30 exec/s: 50 rss: 70Mb L: 16/30 MS: 1 CrossOver- 00:07:44.168 [2024-07-13 10:39:00.519984] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (46080) > buf size (4096) 00:07:44.168 [2024-07-13 10:39:00.520220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2cff00c7 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.168 [2024-07-13 10:39:00.520247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.168 #51 NEW cov: 11826 ft: 14991 corp: 35/432b lim: 30 exec/s: 51 rss: 70Mb L: 10/30 MS: 1 ChangeByte- 00:07:44.168 [2024-07-13 10:39:00.550144] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (46080) > buf size (4096) 00:07:44.168 [2024-07-13 10:39:00.550472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2cff00f7 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.168 [2024-07-13 10:39:00.550498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.168 [2024-07-13 10:39:00.550554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.168 [2024-07-13 10:39:00.550568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.427 #52 NEW cov: 11826 ft: 15062 corp: 36/449b lim: 30 exec/s: 52 rss: 70Mb L: 17/30 MS: 1 PersAutoDict- DE: ",\000\000\000\000\000\000\000"- 00:07:44.427 [2024-07-13 10:39:00.590329] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262148) > buf size (4096) 00:07:44.427 [2024-07-13 10:39:00.590448] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x4a 00:07:44.427 [2024-07-13 10:39:00.590677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01040000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.427 [2024-07-13 10:39:00.590703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.428 [2024-07-13 10:39:00.590761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000081ff cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.428 [2024-07-13 10:39:00.590776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.428 [2024-07-13 10:39:00.590833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.428 [2024-07-13 10:39:00.590847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
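Several mutation tags above end in CMP- or PersAutoDict- DE: "...": libFuzzer intercepted a comparison inside the target, stored the compared bytes as a dictionary entry (DE), and replayed them into later inputs, which is why fixed strings such as ",\000\000\000\000\000\000\000" keep resurfacing. Entries that keep paying off are printed in the "Recommended dictionary" block at the end of each run. A hedged sketch of harvesting that block and feeding it back through libFuzzer's -dict= option (file names are placeholders, the extraction assumes the block layout shown in this log, and whether the SPDK wrapper forwards -dict= is not shown here):

    sed -n '/Recommended dictionary/,/End of recommended dictionary/p' fuzz_run.log \
      | grep -o '"[^"]*"' > nvmf_admin.dict
    # nvmf_admin.dict then contains lines such as:
    #   ",\000\000\000\000\000\000\000"
    #   "I\000\000\000\000\000\000\000"
    ./fuzz_target -dict=nvmf_admin.dict corpus_dir/   # generic libFuzzer invocation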
00:07:44.428 #53 NEW cov: 11826 ft: 15074 corp: 37/471b lim: 30 exec/s: 53 rss: 70Mb L: 22/30 MS: 1 PersAutoDict- DE: "\001\004\000\000\000\000\000\000"- 00:07:44.428 [2024-07-13 10:39:00.630476] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45060) > buf size (4096) 00:07:44.428 [2024-07-13 10:39:00.630591] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (787456) > buf size (4096) 00:07:44.428 [2024-07-13 10:39:00.630699] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (261124) > buf size (4096) 00:07:44.428 [2024-07-13 10:39:00.630804] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x4aff 00:07:44.428 [2024-07-13 10:39:00.630911] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000004a 00:07:44.428 [2024-07-13 10:39:00.631124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.428 [2024-07-13 10:39:00.631150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.428 [2024-07-13 10:39:00.631208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.428 [2024-07-13 10:39:00.631222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.428 [2024-07-13 10:39:00.631282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ff0000ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.428 [2024-07-13 10:39:00.631297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.428 [2024-07-13 10:39:00.631350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d0000024 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.428 [2024-07-13 10:39:00.631364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.428 [2024-07-13 10:39:00.631419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.428 [2024-07-13 10:39:00.631432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:44.428 #54 NEW cov: 11826 ft: 15107 corp: 38/501b lim: 30 exec/s: 54 rss: 70Mb L: 30/30 MS: 1 ShuffleBytes- 00:07:44.428 [2024-07-13 10:39:00.670558] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (1047852) > buf size (4096) 00:07:44.428 [2024-07-13 10:39:00.670675] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (224) > len (4) 00:07:44.428 [2024-07-13 10:39:00.670891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff4a8386 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.428 [2024-07-13 10:39:00.670916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.428 [2024-07-13 10:39:00.670972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 
cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.428 [2024-07-13 10:39:00.670986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.428 #55 NEW cov: 11826 ft: 15117 corp: 39/517b lim: 30 exec/s: 55 rss: 70Mb L: 16/30 MS: 1 ChangeBinInt- 00:07:44.428 [2024-07-13 10:39:00.710554] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3f 00:07:44.428 [2024-07-13 10:39:00.710774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c00002b cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.428 [2024-07-13 10:39:00.710799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.428 #56 NEW cov: 11826 ft: 15142 corp: 40/528b lim: 30 exec/s: 56 rss: 70Mb L: 11/30 MS: 1 InsertByte- 00:07:44.428 [2024-07-13 10:39:00.750734] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45476) > buf size (4096) 00:07:44.428 [2024-07-13 10:39:00.750948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c6800f7 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.428 [2024-07-13 10:39:00.750973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.428 #57 NEW cov: 11826 ft: 15153 corp: 41/538b lim: 30 exec/s: 57 rss: 70Mb L: 10/30 MS: 1 CMP- DE: "y\012"- 00:07:44.428 [2024-07-13 10:39:00.790902] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45636) > buf size (4096) 00:07:44.428 [2024-07-13 10:39:00.791398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c900000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.428 [2024-07-13 10:39:00.791424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.428 [2024-07-13 10:39:00.791480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.428 [2024-07-13 10:39:00.791495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.428 [2024-07-13 10:39:00.791552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.428 [2024-07-13 10:39:00.791565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.428 [2024-07-13 10:39:00.791620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.428 [2024-07-13 10:39:00.791633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.428 #58 NEW cov: 11826 ft: 15203 corp: 42/563b lim: 30 exec/s: 58 rss: 71Mb L: 25/30 MS: 1 InsertRepeatedBytes- 00:07:44.688 [2024-07-13 10:39:00.831035] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45060) > buf size (4096) 00:07:44.688 [2024-07-13 10:39:00.831149] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 
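The neighboring ctrlr.c:2504 "Invalid log page offset 0x30000ffff" rejections trip over the 64-bit log page offset, which in the standard NVMe encoding is assembled from CDW12 (LPOL, low 32 bits) and CDW13 (LPOU, high 32 bits) — dwords this trace does not print. Assuming that encoding, the rejected value splits as:

    printf 'LPOU=0x%x LPOL=0x%x\n' $(( 0x30000ffff >> 32 )) $(( 0x30000ffff & 0xffffffff ))
    # -> LPOU=0x3 LPOL=0xffff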
00:07:44.688 [2024-07-13 10:39:00.831255] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (261300) > buf size (4096) 00:07:44.688 [2024-07-13 10:39:00.831360] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x4aff 00:07:44.688 [2024-07-13 10:39:00.831468] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000042 00:07:44.688 [2024-07-13 10:39:00.831652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.688 [2024-07-13 10:39:00.831679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.688 [2024-07-13 10:39:00.831734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.688 [2024-07-13 10:39:00.831748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.688 [2024-07-13 10:39:00.831804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ff2c0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.688 [2024-07-13 10:39:00.831816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.688 [2024-07-13 10:39:00.831873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d0000024 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.688 [2024-07-13 10:39:00.831886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.688 [2024-07-13 10:39:00.831942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.688 [2024-07-13 10:39:00.831956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:44.688 #59 NEW cov: 11826 ft: 15222 corp: 43/593b lim: 30 exec/s: 59 rss: 71Mb L: 30/30 MS: 1 ChangeBinInt- 00:07:44.688 [2024-07-13 10:39:00.871035] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (110888) > buf size (4096) 00:07:44.688 [2024-07-13 10:39:00.871268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:6c490000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.688 [2024-07-13 10:39:00.871293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.688 #60 NEW cov: 11826 ft: 15329 corp: 44/602b lim: 30 exec/s: 60 rss: 71Mb L: 9/30 MS: 1 CrossOver- 00:07:44.688 [2024-07-13 10:39:00.911246] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xff 00:07:44.688 [2024-07-13 10:39:00.911362] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (41380) > buf size (4096) 00:07:44.688 [2024-07-13 10:39:00.911482] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000004a 00:07:44.688 [2024-07-13 10:39:00.911700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2c000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:44.688 [2024-07-13 10:39:00.911725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.688 [2024-07-13 10:39:00.911784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:286800a9 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.688 [2024-07-13 10:39:00.911798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.688 [2024-07-13 10:39:00.911851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:48908100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:44.688 [2024-07-13 10:39:00.911864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.688 #61 NEW cov: 11826 ft: 15332 corp: 45/620b lim: 30 exec/s: 30 rss: 71Mb L: 18/30 MS: 1 CMP- DE: "\377(h\251\254^\304H"- 00:07:44.688 #61 DONE cov: 11826 ft: 15332 corp: 45/620b lim: 30 exec/s: 30 rss: 71Mb 00:07:44.688 ###### Recommended dictionary. ###### 00:07:44.688 ",\000\000\000\000\000\000\000" # Uses: 2 00:07:44.688 "I\000\000\000\000\000\000\000" # Uses: 3 00:07:44.688 "\001\004\000\000\000\000\000\000" # Uses: 1 00:07:44.688 "y\012" # Uses: 0 00:07:44.688 "\377(h\251\254^\304H" # Uses: 0 00:07:44.688 ###### End of recommended dictionary. ###### 00:07:44.688 Done 61 runs in 2 second(s) 00:07:44.688 10:39:01 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_1.conf 00:07:44.688 10:39:01 -- ../common.sh@72 -- # (( i++ )) 00:07:44.688 10:39:01 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:44.688 10:39:01 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:44.688 10:39:01 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:44.688 10:39:01 -- nvmf/run.sh@24 -- # local timen=1 00:07:44.688 10:39:01 -- nvmf/run.sh@25 -- # local core=0x1 00:07:44.688 10:39:01 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:44.688 10:39:01 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:44.688 10:39:01 -- nvmf/run.sh@29 -- # printf %02d 2 00:07:44.688 10:39:01 -- nvmf/run.sh@29 -- # port=4402 00:07:44.688 10:39:01 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:44.688 10:39:01 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:44.688 10:39:01 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:44.688 10:39:01 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 -r /var/tmp/spdk2.sock 00:07:44.948 [2024-07-13 10:39:01.093530] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:07:44.948 [2024-07-13 10:39:01.093597] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1989786 ] 00:07:44.948 EAL: No free 2048 kB hugepages reported on node 1 00:07:44.948 [2024-07-13 10:39:01.274163] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:44.948 [2024-07-13 10:39:01.293438] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:44.948 [2024-07-13 10:39:01.293619] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.207 [2024-07-13 10:39:01.345087] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:45.207 [2024-07-13 10:39:01.361371] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:45.207 INFO: Running with entropic power schedule (0xFF, 100). 00:07:45.207 INFO: Seed: 3388249821 00:07:45.207 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:07:45.207 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:07:45.207 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:45.207 INFO: A corpus is not provided, starting from an empty corpus 00:07:45.207 #2 INITED exec/s: 0 rss: 60Mb 00:07:45.207 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:45.207 This may also happen if the target rejected all inputs we tried so far 00:07:45.207 [2024-07-13 10:39:01.430983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff007c cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.207 [2024-07-13 10:39:01.431034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.207 [2024-07-13 10:39:01.431191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.207 [2024-07-13 10:39:01.431217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.468 NEW_FUNC[1/670]: 0x4a1a20 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:45.468 NEW_FUNC[2/670]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:45.468 #9 NEW cov: 11511 ft: 11507 corp: 2/16b lim: 35 exec/s: 0 rss: 67Mb L: 15/15 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:45.468 [2024-07-13 10:39:01.771811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff7c00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.468 [2024-07-13 10:39:01.771855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.468 [2024-07-13 10:39:01.771947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.468 [2024-07-13 10:39:01.771970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 
m:0 dnr:0 00:07:45.468 #10 NEW cov: 11624 ft: 12156 corp: 3/31b lim: 35 exec/s: 0 rss: 68Mb L: 15/15 MS: 1 ShuffleBytes- 00:07:45.468 [2024-07-13 10:39:01.821553] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.468 [2024-07-13 10:39:01.821726] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.468 [2024-07-13 10:39:01.821889] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.468 [2024-07-13 10:39:01.822046] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.468 [2024-07-13 10:39:01.822198] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.468 [2024-07-13 10:39:01.822532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.468 [2024-07-13 10:39:01.822569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.468 [2024-07-13 10:39:01.822695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.468 [2024-07-13 10:39:01.822721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.468 [2024-07-13 10:39:01.822843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.468 [2024-07-13 10:39:01.822867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.468 [2024-07-13 10:39:01.822987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.468 [2024-07-13 10:39:01.823010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.468 [2024-07-13 10:39:01.823143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.468 [2024-07-13 10:39:01.823162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:45.468 #11 NEW cov: 11639 ft: 13009 corp: 4/66b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:45.728 [2024-07-13 10:39:01.861916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff007c cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.728 [2024-07-13 10:39:01.861946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.728 [2024-07-13 10:39:01.862064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.728 [2024-07-13 10:39:01.862082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.728 #12 NEW 
cov: 11724 ft: 13333 corp: 5/82b lim: 35 exec/s: 0 rss: 69Mb L: 16/35 MS: 1 InsertByte- 00:07:45.728 [2024-07-13 10:39:01.901968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff007c cdw11:9f00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.728 [2024-07-13 10:39:01.901996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.728 [2024-07-13 10:39:01.902111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.728 [2024-07-13 10:39:01.902128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.728 #13 NEW cov: 11724 ft: 13391 corp: 6/98b lim: 35 exec/s: 0 rss: 69Mb L: 16/35 MS: 1 ShuffleBytes- 00:07:45.728 [2024-07-13 10:39:01.941919] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.728 [2024-07-13 10:39:01.942084] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.728 [2024-07-13 10:39:01.942241] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.728 [2024-07-13 10:39:01.942390] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.728 [2024-07-13 10:39:01.942548] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.728 [2024-07-13 10:39:01.942875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.728 [2024-07-13 10:39:01.942904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.728 [2024-07-13 10:39:01.943026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.728 [2024-07-13 10:39:01.943049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.728 [2024-07-13 10:39:01.943170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:01000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.728 [2024-07-13 10:39:01.943196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.728 [2024-07-13 10:39:01.943317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:01000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.728 [2024-07-13 10:39:01.943339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.728 [2024-07-13 10:39:01.943467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.728 [2024-07-13 10:39:01.943493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:45.728 #14 NEW cov: 11724 ft: 13540 corp: 7/133b lim: 
35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 ChangeBinInt- 00:07:45.728 [2024-07-13 10:39:01.992087] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.728 [2024-07-13 10:39:01.992267] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.728 [2024-07-13 10:39:01.992420] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.728 [2024-07-13 10:39:01.992578] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.728 [2024-07-13 10:39:01.992718] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.728 [2024-07-13 10:39:01.993060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.728 [2024-07-13 10:39:01.993093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.728 [2024-07-13 10:39:01.993212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.728 [2024-07-13 10:39:01.993235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.728 [2024-07-13 10:39:01.993360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.728 [2024-07-13 10:39:01.993384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.728 [2024-07-13 10:39:01.993496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.729 [2024-07-13 10:39:01.993517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.729 [2024-07-13 10:39:01.993634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.729 [2024-07-13 10:39:01.993658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:45.729 #20 NEW cov: 11724 ft: 13588 corp: 8/168b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 ShuffleBytes- 00:07:45.729 [2024-07-13 10:39:02.032164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff007c cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.729 [2024-07-13 10:39:02.032192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.729 #21 NEW cov: 11724 ft: 13953 corp: 9/177b lim: 35 exec/s: 0 rss: 69Mb L: 9/35 MS: 1 EraseBytes- 00:07:45.729 [2024-07-13 10:39:02.072522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff007c cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.729 [2024-07-13 10:39:02.072550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:07:45.729 [2024-07-13 10:39:02.072687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.729 [2024-07-13 10:39:02.072706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.729 #22 NEW cov: 11724 ft: 14062 corp: 10/192b lim: 35 exec/s: 0 rss: 69Mb L: 15/35 MS: 1 ShuffleBytes- 00:07:45.729 [2024-07-13 10:39:02.112244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff7c00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.729 [2024-07-13 10:39:02.112274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.729 [2024-07-13 10:39:02.112413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.729 [2024-07-13 10:39:02.112432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.989 #23 NEW cov: 11724 ft: 14240 corp: 11/207b lim: 35 exec/s: 0 rss: 70Mb L: 15/35 MS: 1 CopyPart- 00:07:45.989 [2024-07-13 10:39:02.152499] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.989 [2024-07-13 10:39:02.152681] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.989 [2024-07-13 10:39:02.152835] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.989 [2024-07-13 10:39:02.152987] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.989 [2024-07-13 10:39:02.153143] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.989 [2024-07-13 10:39:02.153474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.989 [2024-07-13 10:39:02.153507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.989 [2024-07-13 10:39:02.153634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.989 [2024-07-13 10:39:02.153653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.989 [2024-07-13 10:39:02.153775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.989 [2024-07-13 10:39:02.153800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.989 [2024-07-13 10:39:02.153920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.989 [2024-07-13 10:39:02.153946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.989 [2024-07-13 
10:39:02.154068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.989 [2024-07-13 10:39:02.154088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:45.989 #24 NEW cov: 11724 ft: 14257 corp: 12/242b lim: 35 exec/s: 0 rss: 70Mb L: 35/35 MS: 1 CopyPart- 00:07:45.989 [2024-07-13 10:39:02.202674] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.989 [2024-07-13 10:39:02.202850] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.989 [2024-07-13 10:39:02.203011] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.989 [2024-07-13 10:39:02.203165] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.989 [2024-07-13 10:39:02.203335] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.989 [2024-07-13 10:39:02.203709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.989 [2024-07-13 10:39:02.203742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.989 [2024-07-13 10:39:02.203861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:005b0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.989 [2024-07-13 10:39:02.203888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.989 [2024-07-13 10:39:02.204009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.989 [2024-07-13 10:39:02.204032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.989 [2024-07-13 10:39:02.204155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.989 [2024-07-13 10:39:02.204179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.989 [2024-07-13 10:39:02.204307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.989 [2024-07-13 10:39:02.204330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:45.989 #25 NEW cov: 11724 ft: 14307 corp: 13/277b lim: 35 exec/s: 0 rss: 70Mb L: 35/35 MS: 1 ChangeByte- 00:07:45.989 [2024-07-13 10:39:02.253118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff7c00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.989 [2024-07-13 10:39:02.253147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.989 [2024-07-13 10:39:02.253283] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ff2c00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.989 [2024-07-13 10:39:02.253301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.989 #26 NEW cov: 11724 ft: 14342 corp: 14/292b lim: 35 exec/s: 0 rss: 70Mb L: 15/35 MS: 1 ChangeByte- 00:07:45.989 [2024-07-13 10:39:02.293216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff007c cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.989 [2024-07-13 10:39:02.293245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.989 [2024-07-13 10:39:02.293369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.989 [2024-07-13 10:39:02.293388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.989 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:45.989 #27 NEW cov: 11747 ft: 14379 corp: 15/307b lim: 35 exec/s: 0 rss: 70Mb L: 15/35 MS: 1 ShuffleBytes- 00:07:45.989 [2024-07-13 10:39:02.332894] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.989 [2024-07-13 10:39:02.333065] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.990 [2024-07-13 10:39:02.333214] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.990 [2024-07-13 10:39:02.333378] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.990 [2024-07-13 10:39:02.333561] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.990 [2024-07-13 10:39:02.333896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:23000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.990 [2024-07-13 10:39:02.333929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.990 [2024-07-13 10:39:02.334069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.990 [2024-07-13 10:39:02.334094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.990 [2024-07-13 10:39:02.334216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.990 [2024-07-13 10:39:02.334241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.990 [2024-07-13 10:39:02.334368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.990 [2024-07-13 10:39:02.334393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 
cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.990 [2024-07-13 10:39:02.334512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.990 [2024-07-13 10:39:02.334536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:45.990 #28 NEW cov: 11747 ft: 14431 corp: 16/342b lim: 35 exec/s: 0 rss: 70Mb L: 35/35 MS: 1 ChangeBinInt- 00:07:45.990 [2024-07-13 10:39:02.373241] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.990 [2024-07-13 10:39:02.373410] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.990 [2024-07-13 10:39:02.373579] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.990 [2024-07-13 10:39:02.373720] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.990 [2024-07-13 10:39:02.373880] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:45.990 [2024-07-13 10:39:02.374214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.990 [2024-07-13 10:39:02.374250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.990 [2024-07-13 10:39:02.374370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:20000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.990 [2024-07-13 10:39:02.374397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.990 [2024-07-13 10:39:02.374516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:01000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.990 [2024-07-13 10:39:02.374541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.990 [2024-07-13 10:39:02.374667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:01000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.990 [2024-07-13 10:39:02.374688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.990 [2024-07-13 10:39:02.374822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:45.990 [2024-07-13 10:39:02.374846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:46.298 #29 NEW cov: 11747 ft: 14451 corp: 17/377b lim: 35 exec/s: 29 rss: 70Mb L: 35/35 MS: 1 ChangeBit- 00:07:46.298 [2024-07-13 10:39:02.423329] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.298 [2024-07-13 10:39:02.423507] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.298 [2024-07-13 10:39:02.423655] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: 
*ERROR*: Identify Namespace for invalid NSID 0 00:07:46.298 [2024-07-13 10:39:02.423812] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.298 [2024-07-13 10:39:02.423963] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.298 [2024-07-13 10:39:02.424293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.298 [2024-07-13 10:39:02.424326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.298 [2024-07-13 10:39:02.424446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:005b0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.298 [2024-07-13 10:39:02.424466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.298 [2024-07-13 10:39:02.424582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.298 [2024-07-13 10:39:02.424603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.298 [2024-07-13 10:39:02.424679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.298 [2024-07-13 10:39:02.424701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.298 [2024-07-13 10:39:02.424822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.298 [2024-07-13 10:39:02.424846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:46.298 #30 NEW cov: 11747 ft: 14467 corp: 18/412b lim: 35 exec/s: 30 rss: 70Mb L: 35/35 MS: 1 ChangeBinInt- 00:07:46.298 [2024-07-13 10:39:02.483253] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.299 [2024-07-13 10:39:02.483418] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.299 [2024-07-13 10:39:02.483768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.299 [2024-07-13 10:39:02.483802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.299 [2024-07-13 10:39:02.483922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:005b0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.299 [2024-07-13 10:39:02.483947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.299 #31 NEW cov: 11747 ft: 14536 corp: 19/428b lim: 35 exec/s: 31 rss: 70Mb L: 16/35 MS: 1 CrossOver- 00:07:46.299 [2024-07-13 10:39:02.533948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0ac2000a cdw11:c200c2c2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.299 [2024-07-13 10:39:02.533980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.299 [2024-07-13 10:39:02.534108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:c2c200c2 cdw11:c200c2c2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.299 [2024-07-13 10:39:02.534128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.299 #33 NEW cov: 11747 ft: 14589 corp: 20/443b lim: 35 exec/s: 33 rss: 70Mb L: 15/35 MS: 2 CopyPart-InsertRepeatedBytes- 00:07:46.299 [2024-07-13 10:39:02.584099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0ac2000a cdw11:c200c2c2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.299 [2024-07-13 10:39:02.584125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.299 [2024-07-13 10:39:02.584253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:c2c200c2 cdw11:c200a4c2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.299 [2024-07-13 10:39:02.584270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.299 #34 NEW cov: 11747 ft: 14640 corp: 21/458b lim: 35 exec/s: 34 rss: 70Mb L: 15/35 MS: 1 ChangeByte- 00:07:46.299 [2024-07-13 10:39:02.624198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff007c cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.299 [2024-07-13 10:39:02.624226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.299 [2024-07-13 10:39:02.624335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.299 [2024-07-13 10:39:02.624353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.635 #35 NEW cov: 11747 ft: 14653 corp: 22/473b lim: 35 exec/s: 35 rss: 70Mb L: 15/35 MS: 1 ChangeByte- 00:07:46.635 [2024-07-13 10:39:02.673957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff007c cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.635 [2024-07-13 10:39:02.673984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.635 #36 NEW cov: 11747 ft: 14672 corp: 23/484b lim: 35 exec/s: 36 rss: 70Mb L: 11/35 MS: 1 EraseBytes- 00:07:46.635 [2024-07-13 10:39:02.713973] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.635 [2024-07-13 10:39:02.714155] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.635 [2024-07-13 10:39:02.714513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.635 [2024-07-13 10:39:02.714544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.635 [2024-07-13 10:39:02.714659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:005b0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.635 [2024-07-13 10:39:02.714678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.635 #37 NEW cov: 11747 ft: 14726 corp: 24/500b lim: 35 exec/s: 37 rss: 70Mb L: 16/35 MS: 1 CrossOver- 00:07:46.635 [2024-07-13 10:39:02.754064] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.635 [2024-07-13 10:39:02.754242] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.635 [2024-07-13 10:39:02.754606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00bf0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.635 [2024-07-13 10:39:02.754644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.635 [2024-07-13 10:39:02.754762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:005b0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.635 [2024-07-13 10:39:02.754786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.635 #38 NEW cov: 11747 ft: 14741 corp: 25/516b lim: 35 exec/s: 38 rss: 70Mb L: 16/35 MS: 1 ChangeByte- 00:07:46.635 [2024-07-13 10:39:02.794215] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.635 [2024-07-13 10:39:02.794399] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.635 [2024-07-13 10:39:02.794729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.635 [2024-07-13 10:39:02.794761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.635 [2024-07-13 10:39:02.794884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:005b0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.635 [2024-07-13 10:39:02.794910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.635 #39 NEW cov: 11747 ft: 14750 corp: 26/532b lim: 35 exec/s: 39 rss: 70Mb L: 16/35 MS: 1 ShuffleBytes- 00:07:46.635 [2024-07-13 10:39:02.834770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff7f007c cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.635 [2024-07-13 10:39:02.834797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.635 [2024-07-13 10:39:02.834917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.635 [2024-07-13 10:39:02.834935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.635 #40 NEW 
cov: 11747 ft: 14755 corp: 27/547b lim: 35 exec/s: 40 rss: 70Mb L: 15/35 MS: 1 ChangeBit- 00:07:46.635 [2024-07-13 10:39:02.875010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff7c00ff cdw11:ff00ffbf SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.635 [2024-07-13 10:39:02.875040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.635 [2024-07-13 10:39:02.875166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.635 [2024-07-13 10:39:02.875183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.635 #41 NEW cov: 11747 ft: 14761 corp: 28/562b lim: 35 exec/s: 41 rss: 70Mb L: 15/35 MS: 1 ChangeBit- 00:07:46.635 [2024-07-13 10:39:02.914578] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.635 [2024-07-13 10:39:02.914740] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.635 [2024-07-13 10:39:02.915064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00002c00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.635 [2024-07-13 10:39:02.915099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.635 [2024-07-13 10:39:02.915222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00005b00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.635 [2024-07-13 10:39:02.915246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.635 #42 NEW cov: 11747 ft: 14791 corp: 29/579b lim: 35 exec/s: 42 rss: 70Mb L: 17/35 MS: 1 InsertByte- 00:07:46.635 [2024-07-13 10:39:02.955213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff007c cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.635 [2024-07-13 10:39:02.955244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.635 [2024-07-13 10:39:02.955357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.635 [2024-07-13 10:39:02.955375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.635 #43 NEW cov: 11747 ft: 14804 corp: 30/594b lim: 35 exec/s: 43 rss: 70Mb L: 15/35 MS: 1 ChangeBinInt- 00:07:46.635 [2024-07-13 10:39:02.994868] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.636 [2024-07-13 10:39:02.995336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.636 [2024-07-13 10:39:02.995369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.636 [2024-07-13 10:39:02.995492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 
cid:5 nsid:0 cdw10:000000ce cdw11:00005b00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.636 [2024-07-13 10:39:02.995511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.895 #44 NEW cov: 11747 ft: 14826 corp: 31/611b lim: 35 exec/s: 44 rss: 70Mb L: 17/35 MS: 1 InsertByte- 00:07:46.895 [2024-07-13 10:39:03.034989] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.895 [2024-07-13 10:39:03.035159] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.895 [2024-07-13 10:39:03.035486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.895 [2024-07-13 10:39:03.035520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.895 [2024-07-13 10:39:03.035635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:005b0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.895 [2024-07-13 10:39:03.035654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.895 #45 NEW cov: 11747 ft: 14885 corp: 32/627b lim: 35 exec/s: 45 rss: 70Mb L: 16/35 MS: 1 ShuffleBytes- 00:07:46.895 [2024-07-13 10:39:03.075001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.895 [2024-07-13 10:39:03.075028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.895 #46 NEW cov: 11747 ft: 14900 corp: 33/638b lim: 35 exec/s: 46 rss: 70Mb L: 11/35 MS: 1 EraseBytes- 00:07:46.895 [2024-07-13 10:39:03.115321] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.895 [2024-07-13 10:39:03.115494] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.895 [2024-07-13 10:39:03.115643] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.895 [2024-07-13 10:39:03.115795] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.895 [2024-07-13 10:39:03.115954] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:46.895 [2024-07-13 10:39:03.116298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.895 [2024-07-13 10:39:03.116337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.895 [2024-07-13 10:39:03.116454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:02000000 cdw11:20000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.895 [2024-07-13 10:39:03.116475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.895 [2024-07-13 10:39:03.116596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 
cdw11:01000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.895 [2024-07-13 10:39:03.116622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.895 [2024-07-13 10:39:03.116753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:01000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.895 [2024-07-13 10:39:03.116774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.895 [2024-07-13 10:39:03.116910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.895 [2024-07-13 10:39:03.116932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:46.895 #47 NEW cov: 11747 ft: 14926 corp: 34/673b lim: 35 exec/s: 47 rss: 70Mb L: 35/35 MS: 1 ChangeBinInt- 00:07:46.895 [2024-07-13 10:39:03.165566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.895 [2024-07-13 10:39:03.165594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.895 #48 NEW cov: 11747 ft: 14941 corp: 35/684b lim: 35 exec/s: 48 rss: 70Mb L: 11/35 MS: 1 ChangeBit- 00:07:46.895 [2024-07-13 10:39:03.215875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff7c00ff cdw11:ff00fffd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.895 [2024-07-13 10:39:03.215905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.895 [2024-07-13 10:39:03.216020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.895 [2024-07-13 10:39:03.216037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.895 #49 NEW cov: 11747 ft: 14952 corp: 36/699b lim: 35 exec/s: 49 rss: 70Mb L: 15/35 MS: 1 ChangeBit- 00:07:46.895 [2024-07-13 10:39:03.256344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.895 [2024-07-13 10:39:03.256371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.895 [2024-07-13 10:39:03.256486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.895 [2024-07-13 10:39:03.256505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.895 [2024-07-13 10:39:03.256622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:46.895 [2024-07-13 10:39:03.256639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.895 #52 NEW 
cov: 11747 ft: 15132 corp: 37/722b lim: 35 exec/s: 52 rss: 70Mb L: 23/35 MS: 3 ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:07:47.154 [2024-07-13 10:39:03.295805] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:47.154 [2024-07-13 10:39:03.295979] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:47.154 [2024-07-13 10:39:03.296140] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:47.154 [2024-07-13 10:39:03.296296] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:47.154 [2024-07-13 10:39:03.296455] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:47.154 [2024-07-13 10:39:03.296784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.154 [2024-07-13 10:39:03.296816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.154 [2024-07-13 10:39:03.296927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.154 [2024-07-13 10:39:03.296949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.154 [2024-07-13 10:39:03.297077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.154 [2024-07-13 10:39:03.297099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.154 [2024-07-13 10:39:03.297218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.154 [2024-07-13 10:39:03.297241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.154 [2024-07-13 10:39:03.297354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.154 [2024-07-13 10:39:03.297376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:47.154 #53 NEW cov: 11747 ft: 15138 corp: 38/757b lim: 35 exec/s: 53 rss: 70Mb L: 35/35 MS: 1 ShuffleBytes- 00:07:47.154 [2024-07-13 10:39:03.336310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff10007c cdw11:9f0000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.154 [2024-07-13 10:39:03.336337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.154 [2024-07-13 10:39:03.336461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.154 [2024-07-13 10:39:03.336480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.154 #54 NEW 
cov: 11747 ft: 15144 corp: 39/773b lim: 35 exec/s: 54 rss: 70Mb L: 16/35 MS: 1 ChangeBinInt- 00:07:47.154 [2024-07-13 10:39:03.376191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:47.154 [2024-07-13 10:39:03.376218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.154 #55 NEW cov: 11747 ft: 15151 corp: 40/784b lim: 35 exec/s: 27 rss: 70Mb L: 11/35 MS: 1 ShuffleBytes- 00:07:47.154 #55 DONE cov: 11747 ft: 15151 corp: 40/784b lim: 35 exec/s: 27 rss: 70Mb 00:07:47.154 Done 55 runs in 2 second(s) 00:07:47.154 10:39:03 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_2.conf 00:07:47.154 10:39:03 -- ../common.sh@72 -- # (( i++ )) 00:07:47.155 10:39:03 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:47.155 10:39:03 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:47.155 10:39:03 -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:07:47.155 10:39:03 -- nvmf/run.sh@24 -- # local timen=1 00:07:47.155 10:39:03 -- nvmf/run.sh@25 -- # local core=0x1 00:07:47.155 10:39:03 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:47.155 10:39:03 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:07:47.155 10:39:03 -- nvmf/run.sh@29 -- # printf %02d 3 00:07:47.155 10:39:03 -- nvmf/run.sh@29 -- # port=4403 00:07:47.155 10:39:03 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:47.155 10:39:03 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:07:47.155 10:39:03 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:47.155 10:39:03 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 -r /var/tmp/spdk3.sock 00:07:47.155 [2024-07-13 10:39:03.540689] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:47.155 [2024-07-13 10:39:03.540742] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1990382 ] 00:07:47.413 EAL: No free 2048 kB hugepages reported on node 1 00:07:47.413 [2024-07-13 10:39:03.712487] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:47.413 [2024-07-13 10:39:03.731691] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:47.413 [2024-07-13 10:39:03.731829] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.413 [2024-07-13 10:39:03.783485] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:47.413 [2024-07-13 10:39:03.799773] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:07:47.672 INFO: Running with entropic power schedule (0xFF, 100). 
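For reference while reading the run.sh xtrace above: each fuzzer round derives its NVMe/TCP listen port from the round index (44 followed by the zero-padded index, hence port 4403 for round 3) and rewrites the shared JSON config so the target listens there instead of the default 4420. A minimal bash sketch of that per-round setup, reconstructed from the traced commands; the variable names mirror the run.sh locals, but spdk_dir and the redirect into the per-round config file are assumptions, since xtrace does not print redirections:

# Sketch only: per-round config setup as suggested by the xtrace above.
# spdk_dir and the output redirection are assumed, not shown in the log.
fuzzer_type=3
port="44$(printf '%02d' "$fuzzer_type")"              # 4403 for round 3
nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"
corpus_dir="$spdk_dir/../corpus/llvm_nvmf_${fuzzer_type}"
mkdir -p "$corpus_dir"
# Point the target at this round's port instead of the default 4420.
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
    "$spdk_dir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"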
00:07:47.672 INFO: Seed: 1533283142 00:07:47.672 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:07:47.672 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:07:47.672 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:47.672 INFO: A corpus is not provided, starting from an empty corpus 00:07:47.672 #2 INITED exec/s: 0 rss: 60Mb 00:07:47.672 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:47.672 This may also happen if the target rejected all inputs we tried so far 00:07:47.931 NEW_FUNC[1/656]: 0x4a36f0 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:47.931 NEW_FUNC[2/656]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:47.931 #6 NEW cov: 11411 ft: 11401 corp: 2/17b lim: 20 exec/s: 0 rss: 67Mb L: 16/16 MS: 4 CopyPart-ShuffleBytes-ChangeByte-InsertRepeatedBytes- 00:07:47.931 NEW_FUNC[1/3]: 0x1555260 in nvme_ctrlr_process_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ctrlr.c:3790 00:07:47.931 NEW_FUNC[2/3]: 0x17220d0 in spdk_nvme_probe_poll_async /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme.c:1507 00:07:47.931 #11 NEW cov: 11553 ft: 12330 corp: 3/27b lim: 20 exec/s: 0 rss: 68Mb L: 10/16 MS: 5 CopyPart-ShuffleBytes-InsertByte-ChangeBit-InsertRepeatedBytes- 00:07:47.931 #12 NEW cov: 11559 ft: 12515 corp: 4/43b lim: 20 exec/s: 0 rss: 68Mb L: 16/16 MS: 1 ChangeByte- 00:07:48.190 #13 NEW cov: 11644 ft: 12767 corp: 5/59b lim: 20 exec/s: 0 rss: 68Mb L: 16/16 MS: 1 ChangeBit- 00:07:48.190 #14 NEW cov: 11648 ft: 12933 corp: 6/74b lim: 20 exec/s: 0 rss: 68Mb L: 15/16 MS: 1 CrossOver- 00:07:48.190 #15 NEW cov: 11648 ft: 13113 corp: 7/93b lim: 20 exec/s: 0 rss: 68Mb L: 19/19 MS: 1 CopyPart- 00:07:48.190 #16 NEW cov: 11648 ft: 13291 corp: 8/103b lim: 20 exec/s: 0 rss: 68Mb L: 10/19 MS: 1 ChangeBit- 00:07:48.190 #17 NEW cov: 11648 ft: 13359 corp: 9/120b lim: 20 exec/s: 0 rss: 68Mb L: 17/19 MS: 1 CrossOver- 00:07:48.190 #22 NEW cov: 11648 ft: 13673 corp: 10/124b lim: 20 exec/s: 0 rss: 68Mb L: 4/19 MS: 5 ChangeByte-ChangeBinInt-CopyPart-InsertByte-CopyPart- 00:07:48.190 #23 NEW cov: 11648 ft: 13757 corp: 11/144b lim: 20 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 InsertRepeatedBytes- 00:07:48.449 #24 NEW cov: 11648 ft: 13791 corp: 12/154b lim: 20 exec/s: 0 rss: 69Mb L: 10/20 MS: 1 InsertRepeatedBytes- 00:07:48.449 #25 NEW cov: 11648 ft: 13881 corp: 13/173b lim: 20 exec/s: 0 rss: 70Mb L: 19/20 MS: 1 EraseBytes- 00:07:48.449 #26 NEW cov: 11648 ft: 13937 corp: 14/189b lim: 20 exec/s: 0 rss: 70Mb L: 16/20 MS: 1 ChangeByte- 00:07:48.449 NEW_FUNC[1/5]: 0x115bea0 in nvmf_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3224 00:07:48.449 NEW_FUNC[2/5]: 0x115ca20 in nvmf_qpair_abort_aer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3166 00:07:48.449 #27 NEW cov: 11754 ft: 14143 corp: 15/207b lim: 20 exec/s: 0 rss: 70Mb L: 18/20 MS: 1 InsertRepeatedBytes- 00:07:48.449 #28 NEW cov: 11754 ft: 14168 corp: 16/227b lim: 20 exec/s: 0 rss: 70Mb L: 20/20 MS: 1 CopyPart- 00:07:48.708 #29 NEW cov: 11754 ft: 14208 corp: 17/243b lim: 20 exec/s: 29 rss: 70Mb L: 16/20 MS: 1 ChangeByte- 00:07:48.708 #30 NEW cov: 11754 ft: 14217 corp: 18/255b lim: 20 exec/s: 30 rss: 70Mb L: 12/20 MS: 1 EraseBytes- 
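A quick key to the libFuzzer status lines in this round: cov counts covered code edges, ft coverage features, corp gives the corpus as units/total bytes, lim is the current input-length cap, exec/s the execution rate, and L the new input's length versus the largest in the corpus. The corpus byte total is simply the running sum of accepted input lengths, which the records just above and below bear out:

corp: 18/255b after #30 (243 + 12 = 255, the new unit being L: 12)
corp: 19/260b after #34 (255 + 5  = 260)
corp: 20/276b after #35 (260 + 16 = 276)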
00:07:48.708 #34 NEW cov: 11754 ft: 14244 corp: 19/260b lim: 20 exec/s: 34 rss: 70Mb L: 5/20 MS: 4 InsertByte-CMP-CopyPart-InsertRepeatedBytes- DE: "\000\037"- 00:07:48.708 #35 NEW cov: 11754 ft: 14266 corp: 20/276b lim: 20 exec/s: 35 rss: 70Mb L: 16/20 MS: 1 ShuffleBytes- 00:07:48.708 #36 NEW cov: 11754 ft: 14277 corp: 21/292b lim: 20 exec/s: 36 rss: 70Mb L: 16/20 MS: 1 ShuffleBytes- 00:07:48.708 #37 NEW cov: 11754 ft: 14281 corp: 22/309b lim: 20 exec/s: 37 rss: 70Mb L: 17/20 MS: 1 InsertByte- 00:07:48.967 #38 NEW cov: 11754 ft: 14320 corp: 23/329b lim: 20 exec/s: 38 rss: 70Mb L: 20/20 MS: 1 CopyPart- 00:07:48.967 #39 NEW cov: 11754 ft: 14353 corp: 24/345b lim: 20 exec/s: 39 rss: 70Mb L: 16/20 MS: 1 ChangeBit- 00:07:48.967 #40 NEW cov: 11754 ft: 14400 corp: 25/355b lim: 20 exec/s: 40 rss: 70Mb L: 10/20 MS: 1 CrossOver- 00:07:48.967 #41 NEW cov: 11754 ft: 14415 corp: 26/372b lim: 20 exec/s: 41 rss: 70Mb L: 17/20 MS: 1 ChangeBinInt- 00:07:48.967 #42 NEW cov: 11754 ft: 14461 corp: 27/392b lim: 20 exec/s: 42 rss: 70Mb L: 20/20 MS: 1 ChangeBit- 00:07:48.967 #43 NEW cov: 11754 ft: 14498 corp: 28/410b lim: 20 exec/s: 43 rss: 70Mb L: 18/20 MS: 1 PersAutoDict- DE: "\000\037"- 00:07:49.226 #44 NEW cov: 11754 ft: 14510 corp: 29/429b lim: 20 exec/s: 44 rss: 70Mb L: 19/20 MS: 1 CMP- DE: "\006\000\000\000"- 00:07:49.226 #45 NEW cov: 11754 ft: 14565 corp: 30/433b lim: 20 exec/s: 45 rss: 70Mb L: 4/20 MS: 1 ShuffleBytes- 00:07:49.226 #46 NEW cov: 11754 ft: 14607 corp: 31/450b lim: 20 exec/s: 46 rss: 70Mb L: 17/20 MS: 1 InsertByte- 00:07:49.226 #47 NEW cov: 11754 ft: 14612 corp: 32/454b lim: 20 exec/s: 47 rss: 70Mb L: 4/20 MS: 1 CrossOver- 00:07:49.226 #48 NEW cov: 11754 ft: 14624 corp: 33/472b lim: 20 exec/s: 48 rss: 70Mb L: 18/20 MS: 1 CMP- DE: "\000\000\177\215\354\024\000\211"- 00:07:49.226 #49 NEW cov: 11754 ft: 14634 corp: 34/476b lim: 20 exec/s: 49 rss: 70Mb L: 4/20 MS: 1 ChangeBit- 00:07:49.485 #50 NEW cov: 11754 ft: 14655 corp: 35/492b lim: 20 exec/s: 50 rss: 70Mb L: 16/20 MS: 1 ChangeBinInt- 00:07:49.485 #51 NEW cov: 11754 ft: 14658 corp: 36/509b lim: 20 exec/s: 51 rss: 70Mb L: 17/20 MS: 1 InsertByte- 00:07:49.485 #52 NEW cov: 11754 ft: 14659 corp: 37/529b lim: 20 exec/s: 52 rss: 70Mb L: 20/20 MS: 1 PersAutoDict- DE: "\006\000\000\000"- 00:07:49.485 #53 NEW cov: 11754 ft: 14672 corp: 38/548b lim: 20 exec/s: 53 rss: 70Mb L: 19/20 MS: 1 ChangeBinInt- 00:07:49.485 #54 NEW cov: 11754 ft: 14673 corp: 39/567b lim: 20 exec/s: 54 rss: 70Mb L: 19/20 MS: 1 ChangeBit- 00:07:49.485 #55 NEW cov: 11754 ft: 14676 corp: 40/574b lim: 20 exec/s: 27 rss: 70Mb L: 7/20 MS: 1 CopyPart- 00:07:49.485 #55 DONE cov: 11754 ft: 14676 corp: 40/574b lim: 20 exec/s: 27 rss: 70Mb 00:07:49.485 ###### Recommended dictionary. ###### 00:07:49.485 "\000\037" # Uses: 1 00:07:49.485 "\006\000\000\000" # Uses: 1 00:07:49.485 "\000\000\177\215\354\024\000\211" # Uses: 0 00:07:49.485 ###### End of recommended dictionary. 
###### 00:07:49.485 Done 55 runs in 2 second(s) 00:07:49.744 10:39:05 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_3.conf 00:07:49.744 10:39:05 -- ../common.sh@72 -- # (( i++ )) 00:07:49.744 10:39:05 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:49.744 10:39:05 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:07:49.744 10:39:05 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:07:49.744 10:39:05 -- nvmf/run.sh@24 -- # local timen=1 00:07:49.744 10:39:05 -- nvmf/run.sh@25 -- # local core=0x1 00:07:49.744 10:39:05 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:49.744 10:39:05 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:07:49.744 10:39:05 -- nvmf/run.sh@29 -- # printf %02d 4 00:07:49.744 10:39:05 -- nvmf/run.sh@29 -- # port=4404 00:07:49.744 10:39:05 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:49.744 10:39:05 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:07:49.744 10:39:05 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:49.745 10:39:05 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 -r /var/tmp/spdk4.sock 00:07:49.745 [2024-07-13 10:39:05.995585] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:49.745 [2024-07-13 10:39:05.995677] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1991106 ] 00:07:49.745 EAL: No free 2048 kB hugepages reported on node 1 00:07:50.004 [2024-07-13 10:39:06.180613] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:50.004 [2024-07-13 10:39:06.200266] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:50.004 [2024-07-13 10:39:06.200405] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.004 [2024-07-13 10:39:06.251841] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:50.004 [2024-07-13 10:39:06.268129] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:07:50.004 INFO: Running with entropic power schedule (0xFF, 100). 00:07:50.004 INFO: Seed: 4001285990 00:07:50.004 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:07:50.004 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:07:50.004 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:50.004 INFO: A corpus is not provided, starting from an empty corpus 00:07:50.004 #2 INITED exec/s: 0 rss: 60Mb 00:07:50.004 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
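The ../common.sh@72 and @73 trace markers repeated between rounds come from the driver loop that steps through the fuzzer types and hands each one to start_llvm_fuzz. A hedged reconstruction of that loop, consistent with the xtrace ordering here ((( i++ )), then (( i < fuzz_num )), then the call); fuzz_num's actual value is not visible in this excerpt:

# Illustrative only: the driver loop implied by the common.sh xtrace.
# fuzz_num is not shown in this log; 1 and 0x1 are the traced timen and core mask.
for (( i = 0; i < fuzz_num; i++ )); do
    start_llvm_fuzz "$i" 1 0x1    # round index, time budget, core mask
done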
00:07:50.004 This may also happen if the target rejected all inputs we tried so far 00:07:50.004 [2024-07-13 10:39:06.334138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.004 [2024-07-13 10:39:06.334176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.263 NEW_FUNC[1/670]: 0x4a47e0 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:07:50.263 NEW_FUNC[2/670]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:50.263 #8 NEW cov: 11525 ft: 11532 corp: 2/12b lim: 35 exec/s: 0 rss: 68Mb L: 11/11 MS: 1 InsertRepeatedBytes- 00:07:50.523 [2024-07-13 10:39:06.665145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.523 [2024-07-13 10:39:06.665190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.523 NEW_FUNC[1/1]: 0xed8420 in spdk_process_is_primary /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/env.c:290 00:07:50.523 #9 NEW cov: 11645 ft: 12249 corp: 3/23b lim: 35 exec/s: 0 rss: 68Mb L: 11/11 MS: 1 ChangeBinInt- 00:07:50.523 [2024-07-13 10:39:06.715005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:28000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.523 [2024-07-13 10:39:06.715034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.523 #10 NEW cov: 11651 ft: 12465 corp: 4/34b lim: 35 exec/s: 0 rss: 68Mb L: 11/11 MS: 1 ChangeByte- 00:07:50.523 [2024-07-13 10:39:06.755032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:27ff0a00 cdw11:fffb0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.523 [2024-07-13 10:39:06.755059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.523 #11 NEW cov: 11736 ft: 12732 corp: 5/45b lim: 35 exec/s: 0 rss: 68Mb L: 11/11 MS: 1 ChangeBinInt- 00:07:50.523 [2024-07-13 10:39:06.795291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.523 [2024-07-13 10:39:06.795318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.523 #12 NEW cov: 11736 ft: 12801 corp: 6/56b lim: 35 exec/s: 0 rss: 68Mb L: 11/11 MS: 1 ChangeBit- 00:07:50.523 [2024-07-13 10:39:06.835073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.523 [2024-07-13 10:39:06.835098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.523 #13 NEW cov: 11736 ft: 12895 corp: 7/67b lim: 35 exec/s: 0 rss: 68Mb L: 11/11 MS: 1 ChangeByte- 00:07:50.523 [2024-07-13 10:39:06.875826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:000a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.523 [2024-07-13 10:39:06.875854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.523 [2024-07-13 10:39:06.875970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00002800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.523 [2024-07-13 10:39:06.875986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.523 #14 NEW cov: 11736 ft: 13687 corp: 8/87b lim: 35 exec/s: 0 rss: 69Mb L: 20/20 MS: 1 CrossOver- 00:07:50.782 [2024-07-13 10:39:06.925607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:28000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.782 [2024-07-13 10:39:06.925635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.782 #15 NEW cov: 11736 ft: 13750 corp: 9/95b lim: 35 exec/s: 0 rss: 69Mb L: 8/20 MS: 1 EraseBytes- 00:07:50.782 [2024-07-13 10:39:06.965365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:27ff0a00 cdw11:fffb0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.782 [2024-07-13 10:39:06.965392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.782 #16 NEW cov: 11736 ft: 13802 corp: 10/107b lim: 35 exec/s: 0 rss: 69Mb L: 12/20 MS: 1 InsertByte- 00:07:50.782 [2024-07-13 10:39:07.005465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:28000a00 cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.782 [2024-07-13 10:39:07.005492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.782 #17 NEW cov: 11736 ft: 13858 corp: 11/115b lim: 35 exec/s: 0 rss: 69Mb L: 8/20 MS: 1 ChangeByte- 00:07:50.782 [2024-07-13 10:39:07.046098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a3000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.782 [2024-07-13 10:39:07.046125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.782 #18 NEW cov: 11736 ft: 13884 corp: 12/126b lim: 35 exec/s: 0 rss: 69Mb L: 11/20 MS: 1 ChangeByte- 00:07:50.782 [2024-07-13 10:39:07.086168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.782 [2024-07-13 10:39:07.086198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.782 #19 NEW cov: 11736 ft: 13888 corp: 13/138b lim: 35 exec/s: 0 rss: 69Mb L: 12/20 MS: 1 InsertByte- 00:07:50.783 [2024-07-13 10:39:07.126414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a3000a00 cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.783 [2024-07-13 10:39:07.126445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.783 #20 NEW cov: 
11736 ft: 13958 corp: 14/149b lim: 35 exec/s: 0 rss: 69Mb L: 11/20 MS: 1 ChangeBinInt- 00:07:50.783 [2024-07-13 10:39:07.166018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a3000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.783 [2024-07-13 10:39:07.166045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.042 #21 NEW cov: 11736 ft: 14063 corp: 15/160b lim: 35 exec/s: 0 rss: 69Mb L: 11/20 MS: 1 ChangeBinInt- 00:07:51.042 [2024-07-13 10:39:07.206747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:27ff0a00 cdw11:fffb0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.042 [2024-07-13 10:39:07.206777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.042 [2024-07-13 10:39:07.206902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:85000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.042 [2024-07-13 10:39:07.206921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.042 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:51.042 #22 NEW cov: 11759 ft: 14101 corp: 16/180b lim: 35 exec/s: 0 rss: 69Mb L: 20/20 MS: 1 CrossOver- 00:07:51.042 [2024-07-13 10:39:07.266742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00270a0a cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.042 [2024-07-13 10:39:07.266769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.042 #23 NEW cov: 11759 ft: 14234 corp: 17/192b lim: 35 exec/s: 0 rss: 69Mb L: 12/20 MS: 1 CrossOver- 00:07:51.042 [2024-07-13 10:39:07.306425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.042 [2024-07-13 10:39:07.306456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.042 #24 NEW cov: 11759 ft: 14238 corp: 18/203b lim: 35 exec/s: 24 rss: 69Mb L: 11/20 MS: 1 ChangeBinInt- 00:07:51.042 [2024-07-13 10:39:07.336857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:25000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.042 [2024-07-13 10:39:07.336884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.042 #25 NEW cov: 11759 ft: 14247 corp: 19/216b lim: 35 exec/s: 25 rss: 69Mb L: 13/20 MS: 1 InsertByte- 00:07:51.042 [2024-07-13 10:39:07.387132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a3000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.042 [2024-07-13 10:39:07.387160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.042 #26 NEW cov: 11759 ft: 14307 corp: 20/227b lim: 35 exec/s: 26 rss: 69Mb L: 11/20 MS: 1 CopyPart- 00:07:51.042 [2024-07-13 10:39:07.427647] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.042 [2024-07-13 10:39:07.427675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.042 [2024-07-13 10:39:07.427806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2d2dfc2d cdw11:2d2d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.042 [2024-07-13 10:39:07.427823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.042 [2024-07-13 10:39:07.427945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:2d2d2d2d cdw11:2dff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.042 [2024-07-13 10:39:07.427961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.302 #27 NEW cov: 11759 ft: 14552 corp: 21/250b lim: 35 exec/s: 27 rss: 69Mb L: 23/23 MS: 1 InsertRepeatedBytes- 00:07:51.302 [2024-07-13 10:39:07.478051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:27ff0a00 cdw11:fffb0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.302 [2024-07-13 10:39:07.478078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.302 [2024-07-13 10:39:07.478198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:85000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.302 [2024-07-13 10:39:07.478215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.302 [2024-07-13 10:39:07.478337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:2d2d2d2d cdw11:2d2d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.302 [2024-07-13 10:39:07.478354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.302 #28 NEW cov: 11759 ft: 14636 corp: 22/277b lim: 35 exec/s: 28 rss: 70Mb L: 27/27 MS: 1 CrossOver- 00:07:51.302 [2024-07-13 10:39:07.538213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000d30a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.302 [2024-07-13 10:39:07.538242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.302 [2024-07-13 10:39:07.538364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000028 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.302 [2024-07-13 10:39:07.538380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.302 [2024-07-13 10:39:07.538524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00fc0001 cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.302 [2024-07-13 10:39:07.538546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.302 #29 NEW cov: 11759 ft: 14743 corp: 
23/298b lim: 35 exec/s: 29 rss: 70Mb L: 21/27 MS: 1 InsertByte- 00:07:51.302 [2024-07-13 10:39:07.587630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:a3000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.302 [2024-07-13 10:39:07.587659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.302 #30 NEW cov: 11759 ft: 14810 corp: 24/309b lim: 35 exec/s: 30 rss: 70Mb L: 11/27 MS: 1 ShuffleBytes- 00:07:51.302 [2024-07-13 10:39:07.637888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:25000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.302 [2024-07-13 10:39:07.637918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.302 #31 NEW cov: 11759 ft: 14835 corp: 25/322b lim: 35 exec/s: 31 rss: 70Mb L: 13/27 MS: 1 ChangeByte- 00:07:51.302 [2024-07-13 10:39:07.677582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:a3000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.302 [2024-07-13 10:39:07.677610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.562 #35 NEW cov: 11759 ft: 14852 corp: 26/335b lim: 35 exec/s: 35 rss: 70Mb L: 13/27 MS: 4 EraseBytes-EraseBytes-CrossOver-CrossOver- 00:07:51.562 [2024-07-13 10:39:07.717742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:28000a00 cdw11:0a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.562 [2024-07-13 10:39:07.717770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.562 #37 NEW cov: 11759 ft: 14854 corp: 27/345b lim: 35 exec/s: 37 rss: 70Mb L: 10/27 MS: 2 EraseBytes-CopyPart- 00:07:51.562 [2024-07-13 10:39:07.768904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.562 [2024-07-13 10:39:07.768933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.562 [2024-07-13 10:39:07.769052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.562 [2024-07-13 10:39:07.769082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.562 [2024-07-13 10:39:07.769208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.562 [2024-07-13 10:39:07.769226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.562 #38 NEW cov: 11759 ft: 14982 corp: 28/370b lim: 35 exec/s: 38 rss: 70Mb L: 25/27 MS: 1 InsertRepeatedBytes- 00:07:51.562 [2024-07-13 10:39:07.808838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:cfcf0a00 cdw11:cfcf0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.562 [2024-07-13 10:39:07.808869] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.562 [2024-07-13 10:39:07.809001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cfcfcfcf cdw11:cfcf0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.562 [2024-07-13 10:39:07.809019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.562 [2024-07-13 10:39:07.809140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:cfcfcfcf cdw11:cfcf0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.562 [2024-07-13 10:39:07.809157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.562 [2024-07-13 10:39:07.809267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.562 [2024-07-13 10:39:07.809285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.562 #39 NEW cov: 11759 ft: 15289 corp: 29/399b lim: 35 exec/s: 39 rss: 70Mb L: 29/29 MS: 1 InsertRepeatedBytes- 00:07:51.562 [2024-07-13 10:39:07.848004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.562 [2024-07-13 10:39:07.848032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.562 #40 NEW cov: 11759 ft: 15300 corp: 30/410b lim: 35 exec/s: 40 rss: 70Mb L: 11/29 MS: 1 CopyPart- 00:07:51.562 [2024-07-13 10:39:07.888141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0af9 cdw11:ff010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.562 [2024-07-13 10:39:07.888169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.562 #41 NEW cov: 11759 ft: 15311 corp: 31/422b lim: 35 exec/s: 41 rss: 70Mb L: 12/29 MS: 1 ChangeBinInt- 00:07:51.562 [2024-07-13 10:39:07.928535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.562 [2024-07-13 10:39:07.928562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.821 #42 NEW cov: 11759 ft: 15316 corp: 32/433b lim: 35 exec/s: 42 rss: 70Mb L: 11/29 MS: 1 ChangeBit- 00:07:51.821 [2024-07-13 10:39:07.978952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00002a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.821 [2024-07-13 10:39:07.978981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.821 #43 NEW cov: 11759 ft: 15330 corp: 33/445b lim: 35 exec/s: 43 rss: 70Mb L: 12/29 MS: 1 InsertByte- 00:07:51.821 [2024-07-13 10:39:08.029080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:28000a00 cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.821 [2024-07-13 10:39:08.029108] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.821 #49 NEW cov: 11759 ft: 15374 corp: 34/453b lim: 35 exec/s: 49 rss: 70Mb L: 8/29 MS: 1 ChangeByte- 00:07:51.821 [2024-07-13 10:39:08.079216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:dd000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.821 [2024-07-13 10:39:08.079243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.821 #50 NEW cov: 11759 ft: 15404 corp: 35/464b lim: 35 exec/s: 50 rss: 70Mb L: 11/29 MS: 1 ChangeByte- 00:07:51.821 [2024-07-13 10:39:08.119292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:28000a00 cdw11:0a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.821 [2024-07-13 10:39:08.119320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.821 #51 NEW cov: 11759 ft: 15410 corp: 36/474b lim: 35 exec/s: 51 rss: 70Mb L: 10/29 MS: 1 ChangeByte- 00:07:51.821 [2024-07-13 10:39:08.159135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:28680aff cdw11:ade10003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.821 [2024-07-13 10:39:08.159161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.821 #52 NEW cov: 11759 ft: 15416 corp: 37/485b lim: 35 exec/s: 52 rss: 70Mb L: 11/29 MS: 1 CMP- DE: "\377(h\255\341\374\002\004"- 00:07:52.080 [2024-07-13 10:39:08.210150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.080 [2024-07-13 10:39:08.210178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.080 [2024-07-13 10:39:08.210333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2d2dfc2d cdw11:2d2d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.080 [2024-07-13 10:39:08.210352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.080 [2024-07-13 10:39:08.210476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:2d2d2d2d cdw11:2dff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.080 [2024-07-13 10:39:08.210495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.080 #53 NEW cov: 11759 ft: 15423 corp: 38/508b lim: 35 exec/s: 53 rss: 70Mb L: 23/29 MS: 1 ChangeBit- 00:07:52.080 [2024-07-13 10:39:08.259295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0af9 cdw11:28680001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.080 [2024-07-13 10:39:08.259323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.080 #54 NEW cov: 11759 ft: 15459 corp: 39/520b lim: 35 exec/s: 54 rss: 70Mb L: 12/29 MS: 1 PersAutoDict- DE: "\377(h\255\341\374\002\004"- 00:07:52.080 [2024-07-13 10:39:08.299411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) 
qid:0 cid:4 nsid:0 cdw10:27030a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.080 [2024-07-13 10:39:08.299438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.080 #55 NEW cov: 11759 ft: 15464 corp: 40/532b lim: 35 exec/s: 27 rss: 70Mb L: 12/29 MS: 1 CMP- DE: "\003\000\000\000"- 00:07:52.080 #55 DONE cov: 11759 ft: 15464 corp: 40/532b lim: 35 exec/s: 27 rss: 70Mb 00:07:52.080 ###### Recommended dictionary. ###### 00:07:52.080 "\377(h\255\341\374\002\004" # Uses: 1 00:07:52.080 "\003\000\000\000" # Uses: 0 00:07:52.080 ###### End of recommended dictionary. ###### 00:07:52.080 Done 55 runs in 2 second(s) 00:07:52.080 10:39:08 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_4.conf 00:07:52.080 10:39:08 -- ../common.sh@72 -- # (( i++ )) 00:07:52.080 10:39:08 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:52.080 10:39:08 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:52.080 10:39:08 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:07:52.080 10:39:08 -- nvmf/run.sh@24 -- # local timen=1 00:07:52.080 10:39:08 -- nvmf/run.sh@25 -- # local core=0x1 00:07:52.080 10:39:08 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:52.080 10:39:08 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:07:52.080 10:39:08 -- nvmf/run.sh@29 -- # printf %02d 5 00:07:52.080 10:39:08 -- nvmf/run.sh@29 -- # port=4405 00:07:52.080 10:39:08 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:52.080 10:39:08 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:07:52.080 10:39:08 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:52.080 10:39:08 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 -r /var/tmp/spdk5.sock 00:07:52.339 [2024-07-13 10:39:08.468717] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:52.339 [2024-07-13 10:39:08.468802] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1991646 ] 00:07:52.339 EAL: No free 2048 kB hugepages reported on node 1 00:07:52.339 [2024-07-13 10:39:08.645459] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:52.339 [2024-07-13 10:39:08.666133] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:52.339 [2024-07-13 10:39:08.666281] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:52.340 [2024-07-13 10:39:08.717969] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:52.599 [2024-07-13 10:39:08.734239] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:07:52.599 INFO: Running with entropic power schedule (0xFF, 100). 
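Pulling the pieces together: the long run.sh@36 line above launches every round from one template, varying only the round-derived values. A parameterized sketch of that launch, with spdk_dir and out_dir standing in for the Jenkins workspace paths and the other variables as in the earlier setup sketch:

# Sketch of the run.sh@36 launch line; variable names are illustrative.
trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
args=(
    -m 0x1                                 # reactor core mask
    -s 512                                 # hugepage memory size in MB
    -P "$out_dir/llvm/"                    # where crash artifacts are written
    -F "$trid"                             # transport ID the fuzzer connects to
    -c "$nvmf_cfg"                         # per-round JSON config from the sed step
    -t 1                                   # time budget (timen=1 in this log)
    -D "$corpus_dir"                       # per-round corpus directory
    -Z "$fuzzer_type"                      # selects which admin-command fuzzer runs
    -r "/var/tmp/spdk${fuzzer_type}.sock"  # per-round RPC socket
)
"$spdk_dir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" "${args[@]}"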
00:07:52.599 INFO: Seed: 2173322683 00:07:52.599 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:07:52.599 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:07:52.599 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:52.599 INFO: A corpus is not provided, starting from an empty corpus 00:07:52.599 #2 INITED exec/s: 0 rss: 60Mb 00:07:52.599 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:52.599 This may also happen if the target rejected all inputs we tried so far 00:07:52.599 [2024-07-13 10:39:08.789667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:50505050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.599 [2024-07-13 10:39:08.789696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.599 [2024-07-13 10:39:08.789749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:50505050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.599 [2024-07-13 10:39:08.789763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.859 NEW_FUNC[1/671]: 0x4a6970 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:07:52.859 NEW_FUNC[2/671]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:52.859 #19 NEW cov: 11543 ft: 11544 corp: 2/24b lim: 45 exec/s: 0 rss: 68Mb L: 23/23 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:07:52.859 [2024-07-13 10:39:09.100431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:50505050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.859 [2024-07-13 10:39:09.100469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.859 [2024-07-13 10:39:09.100525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:50505050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.859 [2024-07-13 10:39:09.100539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.859 #25 NEW cov: 11656 ft: 12072 corp: 3/47b lim: 45 exec/s: 0 rss: 68Mb L: 23/23 MS: 1 ChangeBit- 00:07:52.859 [2024-07-13 10:39:09.140611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.859 [2024-07-13 10:39:09.140637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.859 [2024-07-13 10:39:09.140692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.859 [2024-07-13 10:39:09.140706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.859 [2024-07-13 10:39:09.140758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ 
(01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.859 [2024-07-13 10:39:09.140773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.859 #28 NEW cov: 11662 ft: 12561 corp: 4/81b lim: 45 exec/s: 0 rss: 68Mb L: 34/34 MS: 3 ShuffleBytes-ChangeBit-InsertRepeatedBytes- 00:07:52.859 [2024-07-13 10:39:09.180879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:01008a00 cdw11:00020000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.859 [2024-07-13 10:39:09.180907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.859 [2024-07-13 10:39:09.180976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.859 [2024-07-13 10:39:09.180990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.859 [2024-07-13 10:39:09.181040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.859 [2024-07-13 10:39:09.181053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.859 [2024-07-13 10:39:09.181103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.859 [2024-07-13 10:39:09.181117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.859 #29 NEW cov: 11747 ft: 13098 corp: 5/119b lim: 45 exec/s: 0 rss: 68Mb L: 38/38 MS: 1 CMP- DE: "\001\000\000\002"- 00:07:52.859 [2024-07-13 10:39:09.220683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:50255050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.859 [2024-07-13 10:39:09.220709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.859 [2024-07-13 10:39:09.220779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:50505050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.859 [2024-07-13 10:39:09.220792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.119 #30 NEW cov: 11747 ft: 13230 corp: 6/142b lim: 45 exec/s: 0 rss: 68Mb L: 23/38 MS: 1 ChangeByte- 00:07:53.119 [2024-07-13 10:39:09.261062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:01008a00 cdw11:00020000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.119 [2024-07-13 10:39:09.261088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.119 [2024-07-13 10:39:09.261143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.119 [2024-07-13 10:39:09.261157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.119 [2024-07-13 10:39:09.261208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.120 [2024-07-13 10:39:09.261221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.120 [2024-07-13 10:39:09.261273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.120 [2024-07-13 10:39:09.261285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.120 #31 NEW cov: 11747 ft: 13308 corp: 7/180b lim: 45 exec/s: 0 rss: 69Mb L: 38/38 MS: 1 ShuffleBytes- 00:07:53.120 [2024-07-13 10:39:09.301107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.120 [2024-07-13 10:39:09.301133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.120 [2024-07-13 10:39:09.301204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.120 [2024-07-13 10:39:09.301222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.120 [2024-07-13 10:39:09.301276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.120 [2024-07-13 10:39:09.301290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.120 #32 NEW cov: 11747 ft: 13358 corp: 8/214b lim: 45 exec/s: 0 rss: 69Mb L: 34/38 MS: 1 PersAutoDict- DE: "\001\000\000\002"- 00:07:53.120 [2024-07-13 10:39:09.341334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:01008a00 cdw11:00020000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.120 [2024-07-13 10:39:09.341359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.120 [2024-07-13 10:39:09.341409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.120 [2024-07-13 10:39:09.341423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.120 [2024-07-13 10:39:09.341495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.120 [2024-07-13 10:39:09.341510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.120 [2024-07-13 10:39:09.341561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.120 [2024-07-13 10:39:09.341574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.120 #33 NEW cov: 11747 ft: 13396 corp: 9/252b lim: 45 exec/s: 0 rss: 69Mb L: 38/38 MS: 1 ChangeBinInt- 00:07:53.120 [2024-07-13 10:39:09.380964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.120 [2024-07-13 10:39:09.380989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.120 #34 NEW cov: 11747 ft: 14177 corp: 10/268b lim: 45 exec/s: 0 rss: 69Mb L: 16/38 MS: 1 CrossOver- 00:07:53.120 [2024-07-13 10:39:09.421553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.120 [2024-07-13 10:39:09.421579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.120 [2024-07-13 10:39:09.421647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.120 [2024-07-13 10:39:09.421661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.120 [2024-07-13 10:39:09.421712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00020100 cdw11:00020000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.120 [2024-07-13 10:39:09.421726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.120 [2024-07-13 10:39:09.421779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.120 [2024-07-13 10:39:09.421795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.120 #35 NEW cov: 11747 ft: 14241 corp: 11/306b lim: 45 exec/s: 0 rss: 69Mb L: 38/38 MS: 1 PersAutoDict- DE: "\001\000\000\002"- 00:07:53.120 [2024-07-13 10:39:09.461768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.120 [2024-07-13 10:39:09.461793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.120 [2024-07-13 10:39:09.461860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.120 [2024-07-13 10:39:09.461874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.120 [2024-07-13 10:39:09.461927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00020001 cdw11:00020000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.120 [2024-07-13 10:39:09.461940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.120 [2024-07-13 10:39:09.461991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.120 [2024-07-13 10:39:09.462005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.120 #36 NEW cov: 11747 ft: 14270 corp: 12/344b lim: 45 exec/s: 0 rss: 69Mb L: 38/38 MS: 1 ShuffleBytes- 00:07:53.120 [2024-07-13 10:39:09.501821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.120 [2024-07-13 10:39:09.501845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.120 [2024-07-13 10:39:09.501899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.120 [2024-07-13 10:39:09.501912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.120 [2024-07-13 10:39:09.501964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00020001 cdw11:00020000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.120 [2024-07-13 10:39:09.501977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.120 [2024-07-13 10:39:09.502029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.120 [2024-07-13 10:39:09.502042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.379 #37 NEW cov: 11747 ft: 14279 corp: 13/382b lim: 45 exec/s: 0 rss: 69Mb L: 38/38 MS: 1 ChangeByte- 00:07:53.379 [2024-07-13 10:39:09.541925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00838a00 cdw11:83830004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.380 [2024-07-13 10:39:09.541951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.380 [2024-07-13 10:39:09.542020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.380 [2024-07-13 10:39:09.542033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.380 [2024-07-13 10:39:09.542085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:01000000 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.380 [2024-07-13 10:39:09.542099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.380 [2024-07-13 10:39:09.542150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.380 [2024-07-13 10:39:09.542165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.380 #38 NEW cov: 11747 ft: 14364 corp: 14/424b lim: 45 exec/s: 0 rss: 69Mb L: 42/42 MS: 1 InsertRepeatedBytes- 00:07:53.380 [2024-07-13 10:39:09.582063] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:8a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.380 [2024-07-13 10:39:09.582088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.380 [2024-07-13 10:39:09.582140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.380 [2024-07-13 10:39:09.582154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.380 [2024-07-13 10:39:09.582205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00020001 cdw11:00020000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.380 [2024-07-13 10:39:09.582218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.380 [2024-07-13 10:39:09.582269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.380 [2024-07-13 10:39:09.582282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.380 #39 NEW cov: 11747 ft: 14383 corp: 15/462b lim: 45 exec/s: 0 rss: 69Mb L: 38/42 MS: 1 ShuffleBytes- 00:07:53.380 [2024-07-13 10:39:09.622162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:01008a00 cdw11:00020000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.380 [2024-07-13 10:39:09.622187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.380 [2024-07-13 10:39:09.622239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.380 [2024-07-13 10:39:09.622253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.380 [2024-07-13 10:39:09.622304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.380 [2024-07-13 10:39:09.622317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.380 [2024-07-13 10:39:09.622369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:45000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.380 [2024-07-13 10:39:09.622382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.380 #40 NEW cov: 11747 ft: 14402 corp: 16/500b lim: 45 exec/s: 0 rss: 69Mb L: 38/42 MS: 1 ChangeByte- 00:07:53.380 [2024-07-13 10:39:09.661840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:50505050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.380 [2024-07-13 10:39:09.661866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.380 NEW_FUNC[1/1]: 0x197bcf0 in 
get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:53.380 #41 NEW cov: 11770 ft: 14501 corp: 17/516b lim: 45 exec/s: 0 rss: 69Mb L: 16/42 MS: 1 EraseBytes- 00:07:53.380 [2024-07-13 10:39:09.701966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:50505050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.380 [2024-07-13 10:39:09.701993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.380 #42 NEW cov: 11770 ft: 14517 corp: 18/532b lim: 45 exec/s: 0 rss: 69Mb L: 16/42 MS: 1 ShuffleBytes- 00:07:53.380 [2024-07-13 10:39:09.742386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:50505050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.380 [2024-07-13 10:39:09.742411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.380 [2024-07-13 10:39:09.742468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00500000 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.380 [2024-07-13 10:39:09.742482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.380 [2024-07-13 10:39:09.742533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:50505050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.380 [2024-07-13 10:39:09.742546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.380 #43 NEW cov: 11770 ft: 14535 corp: 19/559b lim: 45 exec/s: 0 rss: 69Mb L: 27/42 MS: 1 CrossOver- 00:07:53.639 [2024-07-13 10:39:09.782362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:50505050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.639 [2024-07-13 10:39:09.782389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.639 [2024-07-13 10:39:09.782449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:afafb7af cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.639 [2024-07-13 10:39:09.782464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.639 #44 NEW cov: 11770 ft: 14553 corp: 20/582b lim: 45 exec/s: 44 rss: 69Mb L: 23/42 MS: 1 ChangeBinInt- 00:07:53.639 [2024-07-13 10:39:09.822596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.639 [2024-07-13 10:39:09.822622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.639 [2024-07-13 10:39:09.822676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.639 [2024-07-13 10:39:09.822691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.639 [2024-07-13 10:39:09.822743] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.639 [2024-07-13 10:39:09.822757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.639 #45 NEW cov: 11770 ft: 14572 corp: 21/616b lim: 45 exec/s: 45 rss: 69Mb L: 34/42 MS: 1 PersAutoDict- DE: "\001\000\000\002"- 00:07:53.639 [2024-07-13 10:39:09.862561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:50b55050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.639 [2024-07-13 10:39:09.862586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.639 [2024-07-13 10:39:09.862654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:50505050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.639 [2024-07-13 10:39:09.862668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.639 #46 NEW cov: 11770 ft: 14574 corp: 22/639b lim: 45 exec/s: 46 rss: 69Mb L: 23/42 MS: 1 ChangeBinInt- 00:07:53.639 [2024-07-13 10:39:09.892944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:8a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.639 [2024-07-13 10:39:09.892969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.639 [2024-07-13 10:39:09.893022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.639 [2024-07-13 10:39:09.893035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.639 [2024-07-13 10:39:09.893086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.639 [2024-07-13 10:39:09.893099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.639 [2024-07-13 10:39:09.893151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.639 [2024-07-13 10:39:09.893164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.639 #47 NEW cov: 11770 ft: 14582 corp: 23/677b lim: 45 exec/s: 47 rss: 69Mb L: 38/42 MS: 1 CMP- DE: "\000\000\000\000"- 00:07:53.639 [2024-07-13 10:39:09.932938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.639 [2024-07-13 10:39:09.932962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.639 [2024-07-13 10:39:09.933016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.639 [2024-07-13 10:39:09.933029] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.639 [2024-07-13 10:39:09.933098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.639 [2024-07-13 10:39:09.933112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.639 #48 NEW cov: 11770 ft: 14603 corp: 24/708b lim: 45 exec/s: 48 rss: 69Mb L: 31/42 MS: 1 EraseBytes- 00:07:53.639 [2024-07-13 10:39:09.972874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:50255050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.639 [2024-07-13 10:39:09.972899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.639 [2024-07-13 10:39:09.972951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:50502550 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.639 [2024-07-13 10:39:09.972963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.639 #49 NEW cov: 11770 ft: 14622 corp: 25/731b lim: 45 exec/s: 49 rss: 70Mb L: 23/42 MS: 1 CopyPart- 00:07:53.639 [2024-07-13 10:39:10.013311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.639 [2024-07-13 10:39:10.013341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.639 [2024-07-13 10:39:10.013402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.639 [2024-07-13 10:39:10.013419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.639 [2024-07-13 10:39:10.013490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:68b40129 cdw11:12010005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.639 [2024-07-13 10:39:10.013506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.639 [2024-07-13 10:39:10.013566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.639 [2024-07-13 10:39:10.013583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.899 #50 NEW cov: 11770 ft: 14730 corp: 26/773b lim: 45 exec/s: 50 rss: 70Mb L: 42/42 MS: 1 CMP- DE: "\001)h\264\022\001\245("- 00:07:53.899 [2024-07-13 10:39:10.063149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:50505050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.899 [2024-07-13 10:39:10.063176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.899 [2024-07-13 10:39:10.063229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 
cid:5 nsid:0 cdw10:afafb7af cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.899 [2024-07-13 10:39:10.063243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.899 #51 NEW cov: 11770 ft: 14758 corp: 27/796b lim: 45 exec/s: 51 rss: 70Mb L: 23/42 MS: 1 ChangeBit- 00:07:53.899 [2024-07-13 10:39:10.123627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.899 [2024-07-13 10:39:10.123653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.899 [2024-07-13 10:39:10.123704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.899 [2024-07-13 10:39:10.123718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.899 [2024-07-13 10:39:10.123769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:68b40129 cdw11:12010005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.899 [2024-07-13 10:39:10.123782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.899 [2024-07-13 10:39:10.123833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:02000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.899 [2024-07-13 10:39:10.123846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.899 #52 NEW cov: 11770 ft: 14822 corp: 28/838b lim: 45 exec/s: 52 rss: 70Mb L: 42/42 MS: 1 ChangeBit- 00:07:53.899 [2024-07-13 10:39:10.163665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:8a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.899 [2024-07-13 10:39:10.163690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.899 [2024-07-13 10:39:10.163743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.900 [2024-07-13 10:39:10.163756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.900 [2024-07-13 10:39:10.163809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.900 [2024-07-13 10:39:10.163822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.900 [2024-07-13 10:39:10.163876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:08000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.900 [2024-07-13 10:39:10.163889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.900 #53 NEW cov: 11770 ft: 14851 corp: 29/876b lim: 45 exec/s: 53 rss: 70Mb L: 38/42 MS: 1 ChangeBinInt- 00:07:53.900 
[2024-07-13 10:39:10.203682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.900 [2024-07-13 10:39:10.203707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.900 [2024-07-13 10:39:10.203760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.900 [2024-07-13 10:39:10.203774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.900 [2024-07-13 10:39:10.203826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.900 [2024-07-13 10:39:10.203856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.900 #54 NEW cov: 11770 ft: 14857 corp: 30/910b lim: 45 exec/s: 54 rss: 70Mb L: 34/42 MS: 1 CopyPart- 00:07:53.900 [2024-07-13 10:39:10.243909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.900 [2024-07-13 10:39:10.243934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.900 [2024-07-13 10:39:10.243988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.900 [2024-07-13 10:39:10.244001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.900 [2024-07-13 10:39:10.244054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:68b40129 cdw11:12010005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.900 [2024-07-13 10:39:10.244067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.900 [2024-07-13 10:39:10.244117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.900 [2024-07-13 10:39:10.244130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.900 #55 NEW cov: 11770 ft: 14864 corp: 31/953b lim: 45 exec/s: 55 rss: 70Mb L: 43/43 MS: 1 InsertByte- 00:07:53.900 [2024-07-13 10:39:10.283869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.900 [2024-07-13 10:39:10.283895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.900 [2024-07-13 10:39:10.283949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.900 [2024-07-13 10:39:10.283963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.900 
[2024-07-13 10:39:10.284016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.900 [2024-07-13 10:39:10.284029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.159 #56 NEW cov: 11770 ft: 14873 corp: 32/987b lim: 45 exec/s: 56 rss: 70Mb L: 34/43 MS: 1 CopyPart- 00:07:54.159 [2024-07-13 10:39:10.323835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:50505050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.159 [2024-07-13 10:39:10.323860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.160 [2024-07-13 10:39:10.323914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:502e5050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.160 [2024-07-13 10:39:10.323928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.160 #57 NEW cov: 11770 ft: 14884 corp: 33/1010b lim: 45 exec/s: 57 rss: 70Mb L: 23/43 MS: 1 ChangeByte- 00:07:54.160 [2024-07-13 10:39:10.354204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.160 [2024-07-13 10:39:10.354231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.160 [2024-07-13 10:39:10.354302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.160 [2024-07-13 10:39:10.354318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.160 [2024-07-13 10:39:10.354372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:29680001 cdw11:b4120000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.160 [2024-07-13 10:39:10.354387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.160 [2024-07-13 10:39:10.354449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.160 [2024-07-13 10:39:10.354466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.160 #58 NEW cov: 11770 ft: 14933 corp: 34/1054b lim: 45 exec/s: 58 rss: 70Mb L: 44/44 MS: 1 InsertByte- 00:07:54.160 [2024-07-13 10:39:10.394336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.160 [2024-07-13 10:39:10.394361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.160 [2024-07-13 10:39:10.394430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.160 [2024-07-13 10:39:10.394447] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.160 [2024-07-13 10:39:10.394500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:68b40129 cdw11:12010005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.160 [2024-07-13 10:39:10.394514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.160 [2024-07-13 10:39:10.394566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:02000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.160 [2024-07-13 10:39:10.394580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.160 #59 NEW cov: 11773 ft: 15135 corp: 35/1096b lim: 45 exec/s: 59 rss: 70Mb L: 42/44 MS: 1 ShuffleBytes- 00:07:54.160 [2024-07-13 10:39:10.434312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:50505050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.160 [2024-07-13 10:39:10.434341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.160 [2024-07-13 10:39:10.434396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:feff0000 cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.160 [2024-07-13 10:39:10.434410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.160 [2024-07-13 10:39:10.434467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:50505050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.160 [2024-07-13 10:39:10.434481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.160 #60 NEW cov: 11773 ft: 15148 corp: 36/1127b lim: 45 exec/s: 60 rss: 70Mb L: 31/44 MS: 1 CMP- DE: "\376\377\377\377"- 00:07:54.160 [2024-07-13 10:39:10.474289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:50505050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.160 [2024-07-13 10:39:10.474315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.160 [2024-07-13 10:39:10.474369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:50d05050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.160 [2024-07-13 10:39:10.474382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.160 #61 NEW cov: 11773 ft: 15159 corp: 37/1150b lim: 45 exec/s: 61 rss: 70Mb L: 23/44 MS: 1 ChangeByte- 00:07:54.160 [2024-07-13 10:39:10.514711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.160 [2024-07-13 10:39:10.514736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.160 [2024-07-13 10:39:10.514788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 
cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.160 [2024-07-13 10:39:10.514801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.160 [2024-07-13 10:39:10.514838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:68b40129 cdw11:12010005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.160 [2024-07-13 10:39:10.514851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.160 [2024-07-13 10:39:10.514902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:d4000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.160 [2024-07-13 10:39:10.514914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.160 #62 NEW cov: 11773 ft: 15168 corp: 38/1194b lim: 45 exec/s: 62 rss: 70Mb L: 44/44 MS: 1 InsertByte- 00:07:54.420 [2024-07-13 10:39:10.554658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.420 [2024-07-13 10:39:10.554684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.420 [2024-07-13 10:39:10.554739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.420 [2024-07-13 10:39:10.554753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.420 [2024-07-13 10:39:10.554805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00fb0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.420 [2024-07-13 10:39:10.554822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.420 #63 NEW cov: 11773 ft: 15193 corp: 39/1228b lim: 45 exec/s: 63 rss: 70Mb L: 34/44 MS: 1 ChangeBinInt- 00:07:54.420 [2024-07-13 10:39:10.594957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.420 [2024-07-13 10:39:10.594982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.420 [2024-07-13 10:39:10.595037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.420 [2024-07-13 10:39:10.595051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.420 [2024-07-13 10:39:10.595101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:68b40129 cdw11:12010005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.420 [2024-07-13 10:39:10.595114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.420 [2024-07-13 10:39:10.595166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) 
qid:0 cid:7 nsid:0 cdw10:02000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.420 [2024-07-13 10:39:10.595181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.420 #64 NEW cov: 11773 ft: 15203 corp: 40/1270b lim: 45 exec/s: 64 rss: 70Mb L: 42/44 MS: 1 ShuffleBytes- 00:07:54.420 [2024-07-13 10:39:10.634961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:50505050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.420 [2024-07-13 10:39:10.634986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.420 [2024-07-13 10:39:10.635039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:afafb7af cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.420 [2024-07-13 10:39:10.635052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.420 [2024-07-13 10:39:10.635104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:50505050 cdw11:0a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.420 [2024-07-13 10:39:10.635118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.420 #65 NEW cov: 11773 ft: 15258 corp: 41/1297b lim: 45 exec/s: 65 rss: 70Mb L: 27/44 MS: 1 InsertRepeatedBytes- 00:07:54.420 [2024-07-13 10:39:10.675210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:50505050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.420 [2024-07-13 10:39:10.675235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.420 [2024-07-13 10:39:10.675288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:50505050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.420 [2024-07-13 10:39:10.675301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.420 [2024-07-13 10:39:10.675353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:5050502e cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.420 [2024-07-13 10:39:10.675366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.420 [2024-07-13 10:39:10.675417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:502e5050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.420 [2024-07-13 10:39:10.675433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.420 #66 NEW cov: 11773 ft: 15276 corp: 42/1338b lim: 45 exec/s: 66 rss: 70Mb L: 41/44 MS: 1 CopyPart- 00:07:54.420 [2024-07-13 10:39:10.715316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:01008a00 cdw11:00020000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.420 [2024-07-13 10:39:10.715340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0
00:07:54.420 [2024-07-13 10:39:10.715394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00940000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:54.420 [2024-07-13 10:39:10.715407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:54.420 [2024-07-13 10:39:10.715455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:54.420 [2024-07-13 10:39:10.715468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:54.420 [2024-07-13 10:39:10.715517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:45000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:54.420 [2024-07-13 10:39:10.715529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:54.420 #67 NEW cov: 11773 ft: 15285 corp: 43/1376b lim: 45 exec/s: 67 rss: 70Mb L: 38/44 MS: 1 ChangeByte-
00:07:54.420 [2024-07-13 10:39:10.755132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:50b55050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:54.420 [2024-07-13 10:39:10.755157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:54.420 [2024-07-13 10:39:10.755208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:50505050 cdw11:50500002 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:54.420 [2024-07-13 10:39:10.755221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:54.420 #68 NEW cov: 11773 ft: 15292 corp: 44/1399b lim: 45 exec/s: 34 rss: 70Mb L: 23/44 MS: 1 ShuffleBytes-
00:07:54.420 #68 DONE cov: 11773 ft: 15292 corp: 44/1399b lim: 45 exec/s: 34 rss: 70Mb
00:07:54.420 ###### Recommended dictionary. ######
00:07:54.420 "\001\000\000\002" # Uses: 3
00:07:54.420 "\000\000\000\000" # Uses: 0
00:07:54.420 "\001)h\264\022\001\245(" # Uses: 0
00:07:54.420 "\376\377\377\377" # Uses: 0
00:07:54.420 ###### End of recommended dictionary. ######
00:07:54.420 Done 68 runs in 2 second(s)
00:07:54.680 10:39:10 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_5.conf
00:07:54.680 10:39:10 -- ../common.sh@72 -- # (( i++ ))
00:07:54.680 10:39:10 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:54.680 10:39:10 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1
00:07:54.680 10:39:10 -- nvmf/run.sh@23 -- # local fuzzer_type=6
00:07:54.680 10:39:10 -- nvmf/run.sh@24 -- # local timen=1
00:07:54.680 10:39:10 -- nvmf/run.sh@25 -- # local core=0x1
00:07:54.680 10:39:10 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6
00:07:54.680 10:39:10 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf
00:07:54.680 10:39:10 -- nvmf/run.sh@29 -- # printf %02d 6
00:07:54.680 10:39:10 -- nvmf/run.sh@29 -- # port=4406
00:07:54.680 10:39:10 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6
00:07:54.680 10:39:10 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406'
00:07:54.680 10:39:10 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:54.680 10:39:10 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 -r /var/tmp/spdk6.sock
00:07:54.680 [2024-07-13 10:39:10.931264] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization...
00:07:54.680 [2024-07-13 10:39:10.931336] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1992092 ]
00:07:54.680 EAL: No free 2048 kB hugepages reported on node 1
00:07:54.939 [2024-07-13 10:39:11.116759] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:54.939 [2024-07-13 10:39:11.136627] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:54.939 [2024-07-13 10:39:11.136753] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:54.939 [2024-07-13 10:39:11.188149] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:54.939 [2024-07-13 10:39:11.204462] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 ***
00:07:54.939 INFO: Running with entropic power schedule (0xFF, 100).
00:07:54.939 INFO: Seed: 348371500
00:07:54.939 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9),
00:07:54.939 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280),
00:07:54.939 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6
00:07:54.939 INFO: A corpus is not provided, starting from an empty corpus
00:07:54.939 #2 INITED exec/s: 0 rss: 60Mb
00:07:54.939 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
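The nvmf/run.sh xtrace above shows how each fuzzer instance is isolated before launch: fuzzer 6 gets its own corpus directory, its own JSON target config, and its own NVMe/TCP port (44 followed by the zero-padded fuzzer number, hence 4406). A minimal bash sketch of that setup, reconstructed only from the traced commands; the $rootdir variable, the function body layout, and the redirection of sed's output into the per-instance config are assumptions, since xtrace does not show them:

    # Reconstruction of the start_llvm_fuzz setup traced above; not the verbatim script.
    start_llvm_fuzz() {
        local fuzzer_type=$1    # -Z: which admin-opcode fuzzer to run (6 here)
        local timen=$2          # -t: run time in seconds
        local core=$3           # -m: reactor core mask
        local corpus_dir=$rootdir/../corpus/llvm_nvmf_$fuzzer_type    # $rootdir assumed
        local nvmf_cfg=/tmp/fuzz_json_$fuzzer_type.conf
        # One NVMe/TCP listener port per instance: "44" + zero-padded fuzzer number.
        local port="44$(printf %02d $fuzzer_type)"
        mkdir -p "$corpus_dir"
        local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
        # Point this instance's target config at its own port instead of the
        # template default 4420 (output redirection assumed, not visible in xtrace).
        sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
            "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
        "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
            -P "$rootdir/../output/llvm/" -F "$trid" -c "$nvmf_cfg" -t "$timen" \
            -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk$fuzzer_type.sock"
    }

The libFuzzer status records that follow (and that fill the run above) can be read field by field. Taking one line from the previous run, with the standard libFuzzer field meanings (the annotations are ours, not part of the log):

    #44 NEW cov: 11770 ft: 14553 corp: 20/582b lim: 45 exec/s: 44 rss: 69Mb L: 23/42 MS: 1 ChangeBinInt-
    # #44         -> 44 inputs executed so far; this one was new and kept
    # cov/ft      -> coverage points and finer-grained features seen so far
    # corp        -> corpus now holds 20 inputs totalling 582 bytes
    # lim: 45     -> current cap on generated input length
    # exec/s, rss -> execution rate and resident memory
    # L: 23/42    -> this input's length / longest unit in the corpus
    # MS          -> mutation sequence that produced it (one ChangeBinInt here)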
00:07:54.939 This may also happen if the target rejected all inputs we tried so far
00:07:54.939 [2024-07-13 10:39:11.280623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b9ff cdw11:00000000
00:07:54.939 [2024-07-13 10:39:11.280660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:54.939 [2024-07-13 10:39:11.280778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000
00:07:54.939 [2024-07-13 10:39:11.280795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:55.198 NEW_FUNC[1/669]: 0x4a9180 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161
00:07:55.198 NEW_FUNC[2/669]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:07:55.198 #6 NEW cov: 11460 ft: 11461 corp: 2/5b lim: 10 exec/s: 0 rss: 68Mb L: 4/4 MS: 4 ChangeByte-ChangeByte-CopyPart-InsertRepeatedBytes-
00:07:55.457 [2024-07-13 10:39:11.611416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000
00:07:55.457 [2024-07-13 10:39:11.611470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:55.457 #7 NEW cov: 11573 ft: 12203 corp: 3/7b lim: 10 exec/s: 0 rss: 68Mb L: 2/4 MS: 1 CrossOver-
00:07:55.457 [2024-07-13 10:39:11.662268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000
00:07:55.457 [2024-07-13 10:39:11.662298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:55.457 [2024-07-13 10:39:11.662421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000
00:07:55.457 [2024-07-13 10:39:11.662437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:55.457 [2024-07-13 10:39:11.662565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000
00:07:55.457 [2024-07-13 10:39:11.662584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:55.457 [2024-07-13 10:39:11.662698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000
00:07:55.457 [2024-07-13 10:39:11.662714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:55.457 [2024-07-13 10:39:11.662826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000
00:07:55.457 [2024-07-13 10:39:11.662842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:07:55.457 #9 NEW cov: 11579 ft: 12698 corp: 4/17b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 2 ChangeBit-InsertRepeatedBytes-
00:07:55.457 [2024-07-13 10:39:11.711981] nvme_qpair.c:
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:55.457 [2024-07-13 10:39:11.712009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.457 [2024-07-13 10:39:11.712127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002b2b cdw11:00000000 00:07:55.457 [2024-07-13 10:39:11.712142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.457 [2024-07-13 10:39:11.712257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00002b2b cdw11:00000000 00:07:55.457 [2024-07-13 10:39:11.712272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.457 #10 NEW cov: 11664 ft: 13169 corp: 5/24b lim: 10 exec/s: 0 rss: 68Mb L: 7/10 MS: 1 InsertRepeatedBytes- 00:07:55.457 [2024-07-13 10:39:11.762353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:55.457 [2024-07-13 10:39:11.762382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.457 [2024-07-13 10:39:11.762523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002b2b cdw11:00000000 00:07:55.457 [2024-07-13 10:39:11.762540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.457 [2024-07-13 10:39:11.762656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00002b24 cdw11:00000000 00:07:55.457 [2024-07-13 10:39:11.762672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.457 [2024-07-13 10:39:11.762786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00002b2b cdw11:00000000 00:07:55.457 [2024-07-13 10:39:11.762805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.457 #11 NEW cov: 11664 ft: 13272 corp: 6/32b lim: 10 exec/s: 0 rss: 68Mb L: 8/10 MS: 1 InsertByte- 00:07:55.457 [2024-07-13 10:39:11.811900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aab cdw11:00000000 00:07:55.457 [2024-07-13 10:39:11.811926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.457 #13 NEW cov: 11664 ft: 13308 corp: 7/34b lim: 10 exec/s: 0 rss: 68Mb L: 2/10 MS: 2 ShuffleBytes-InsertByte- 00:07:55.716 [2024-07-13 10:39:11.862099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a2b cdw11:00000000 00:07:55.716 [2024-07-13 10:39:11.862126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.716 #14 NEW cov: 11664 ft: 13544 corp: 8/36b lim: 10 exec/s: 0 rss: 68Mb L: 2/10 MS: 1 ChangeBit- 00:07:55.716 [2024-07-13 10:39:11.912667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff 
cdw11:00000000 00:07:55.716 [2024-07-13 10:39:11.912697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.716 [2024-07-13 10:39:11.912823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:55.716 [2024-07-13 10:39:11.912841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.716 [2024-07-13 10:39:11.912955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:55.716 [2024-07-13 10:39:11.912975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.716 #15 NEW cov: 11664 ft: 13568 corp: 9/43b lim: 10 exec/s: 0 rss: 68Mb L: 7/10 MS: 1 InsertRepeatedBytes- 00:07:55.716 [2024-07-13 10:39:11.963093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:55.716 [2024-07-13 10:39:11.963121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.716 [2024-07-13 10:39:11.963240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:55.716 [2024-07-13 10:39:11.963258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.716 [2024-07-13 10:39:11.963379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00002b2b cdw11:00000000 00:07:55.716 [2024-07-13 10:39:11.963396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.716 [2024-07-13 10:39:11.963539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00002b2b cdw11:00000000 00:07:55.716 [2024-07-13 10:39:11.963557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.716 #16 NEW cov: 11664 ft: 13626 corp: 10/52b lim: 10 exec/s: 0 rss: 69Mb L: 9/10 MS: 1 CrossOver- 00:07:55.716 [2024-07-13 10:39:12.013043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:55.716 [2024-07-13 10:39:12.013071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.716 [2024-07-13 10:39:12.013207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:55.716 [2024-07-13 10:39:12.013225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.716 [2024-07-13 10:39:12.013342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:55.716 [2024-07-13 10:39:12.013362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.716 #17 NEW cov: 11664 ft: 13683 corp: 11/59b lim: 10 exec/s: 0 rss: 69Mb L: 7/10 MS: 1 CrossOver- 00:07:55.716 [2024-07-13 10:39:12.063295] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:55.716 [2024-07-13 10:39:12.063325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.716 [2024-07-13 10:39:12.063459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002b2b cdw11:00000000 00:07:55.716 [2024-07-13 10:39:12.063477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.716 [2024-07-13 10:39:12.063614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00002b2b cdw11:00000000 00:07:55.716 [2024-07-13 10:39:12.063634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.716 #18 NEW cov: 11664 ft: 13765 corp: 12/66b lim: 10 exec/s: 0 rss: 69Mb L: 7/10 MS: 1 ShuffleBytes- 00:07:55.975 [2024-07-13 10:39:12.113875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:55.975 [2024-07-13 10:39:12.113904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.975 [2024-07-13 10:39:12.113997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:55.975 [2024-07-13 10:39:12.114016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.975 [2024-07-13 10:39:12.114133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000002b cdw11:00000000 00:07:55.975 [2024-07-13 10:39:12.114152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.975 [2024-07-13 10:39:12.114270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:55.975 [2024-07-13 10:39:12.114288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.975 [2024-07-13 10:39:12.114404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 00:07:55.975 [2024-07-13 10:39:12.114423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.975 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:55.975 #19 NEW cov: 11687 ft: 13817 corp: 13/76b lim: 10 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 ChangeByte- 00:07:55.975 [2024-07-13 10:39:12.173336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b9ff cdw11:00000000 00:07:55.975 [2024-07-13 10:39:12.173365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.975 [2024-07-13 10:39:12.173492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000faff cdw11:00000000 00:07:55.975 [2024-07-13 10:39:12.173511] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.975 #20 NEW cov: 11687 ft: 13856 corp: 14/80b lim: 10 exec/s: 0 rss: 69Mb L: 4/10 MS: 1 ChangeBinInt- 00:07:55.975 [2024-07-13 10:39:12.234153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:55.975 [2024-07-13 10:39:12.234183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.975 [2024-07-13 10:39:12.234310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:55.975 [2024-07-13 10:39:12.234328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.975 [2024-07-13 10:39:12.234453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:55.975 [2024-07-13 10:39:12.234472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.975 [2024-07-13 10:39:12.234554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:55.975 [2024-07-13 10:39:12.234572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.976 [2024-07-13 10:39:12.234695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 00:07:55.976 [2024-07-13 10:39:12.234717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.976 #21 NEW cov: 11687 ft: 13875 corp: 15/90b lim: 10 exec/s: 21 rss: 69Mb L: 10/10 MS: 1 ChangeBinInt- 00:07:55.976 [2024-07-13 10:39:12.283901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:55.976 [2024-07-13 10:39:12.283929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.976 [2024-07-13 10:39:12.284065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002b07 cdw11:00000000 00:07:55.976 [2024-07-13 10:39:12.284082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.976 [2024-07-13 10:39:12.284209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000002b cdw11:00000000 00:07:55.976 [2024-07-13 10:39:12.284226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.976 #22 NEW cov: 11687 ft: 13885 corp: 16/97b lim: 10 exec/s: 22 rss: 69Mb L: 7/10 MS: 1 ChangeBinInt- 00:07:55.976 [2024-07-13 10:39:12.333834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b9ff cdw11:00000000 00:07:55.976 [2024-07-13 10:39:12.333866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.976 [2024-07-13 10:39:12.333990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO 
CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffef cdw11:00000000 00:07:55.976 [2024-07-13 10:39:12.334009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.976 #23 NEW cov: 11687 ft: 13906 corp: 17/101b lim: 10 exec/s: 23 rss: 69Mb L: 4/10 MS: 1 ChangeBit- 00:07:56.235 [2024-07-13 10:39:12.383807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002eb9 cdw11:00000000 00:07:56.235 [2024-07-13 10:39:12.383835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.235 #25 NEW cov: 11687 ft: 13971 corp: 18/103b lim: 10 exec/s: 25 rss: 69Mb L: 2/10 MS: 2 CrossOver-InsertByte- 00:07:56.235 [2024-07-13 10:39:12.434824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.235 [2024-07-13 10:39:12.434852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.235 [2024-07-13 10:39:12.434975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 00:07:56.235 [2024-07-13 10:39:12.434991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.235 [2024-07-13 10:39:12.435120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.235 [2024-07-13 10:39:12.435137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.235 [2024-07-13 10:39:12.435252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.235 [2024-07-13 10:39:12.435269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.235 [2024-07-13 10:39:12.435387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 00:07:56.235 [2024-07-13 10:39:12.435401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.235 #26 NEW cov: 11687 ft: 13993 corp: 19/113b lim: 10 exec/s: 26 rss: 69Mb L: 10/10 MS: 1 ChangeBinInt- 00:07:56.235 [2024-07-13 10:39:12.484812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.235 [2024-07-13 10:39:12.484843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.235 [2024-07-13 10:39:12.484960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:56.235 [2024-07-13 10:39:12.484976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.235 [2024-07-13 10:39:12.485101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00002b2b cdw11:00000000 00:07:56.235 [2024-07-13 10:39:12.485120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 
m:0 dnr:0 00:07:56.235 [2024-07-13 10:39:12.485236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00002b2b cdw11:00000000 00:07:56.235 [2024-07-13 10:39:12.485253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.235 #27 NEW cov: 11687 ft: 14023 corp: 20/122b lim: 10 exec/s: 27 rss: 69Mb L: 9/10 MS: 1 CMP- DE: "\000\000"- 00:07:56.235 [2024-07-13 10:39:12.535015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.235 [2024-07-13 10:39:12.535046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.235 [2024-07-13 10:39:12.535169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:56.235 [2024-07-13 10:39:12.535187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.235 [2024-07-13 10:39:12.535309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00002b2b cdw11:00000000 00:07:56.235 [2024-07-13 10:39:12.535328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.235 [2024-07-13 10:39:12.535452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00002b2b cdw11:00000000 00:07:56.235 [2024-07-13 10:39:12.535469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.235 #28 NEW cov: 11687 ft: 14039 corp: 21/131b lim: 10 exec/s: 28 rss: 69Mb L: 9/10 MS: 1 ShuffleBytes- 00:07:56.235 [2024-07-13 10:39:12.594711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009393 cdw11:00000000 00:07:56.235 [2024-07-13 10:39:12.594740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.235 [2024-07-13 10:39:12.594864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000930a cdw11:00000000 00:07:56.235 [2024-07-13 10:39:12.594881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.235 #29 NEW cov: 11687 ft: 14056 corp: 22/135b lim: 10 exec/s: 29 rss: 69Mb L: 4/10 MS: 1 InsertRepeatedBytes- 00:07:56.494 [2024-07-13 10:39:12.644819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:56.494 [2024-07-13 10:39:12.644848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.494 [2024-07-13 10:39:12.644968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000000ab cdw11:00000000 00:07:56.494 [2024-07-13 10:39:12.644984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.494 #30 NEW cov: 11687 ft: 14125 corp: 23/140b lim: 10 exec/s: 30 rss: 69Mb L: 5/10 MS: 1 CrossOver- 00:07:56.494 [2024-07-13 10:39:12.695302] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:56.494 [2024-07-13 10:39:12.695329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.494 [2024-07-13 10:39:12.695458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002b07 cdw11:00000000 00:07:56.494 [2024-07-13 10:39:12.695476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.494 [2024-07-13 10:39:12.695591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:000000c2 cdw11:00000000 00:07:56.494 [2024-07-13 10:39:12.695609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.494 #31 NEW cov: 11687 ft: 14154 corp: 24/147b lim: 10 exec/s: 31 rss: 69Mb L: 7/10 MS: 1 ChangeByte- 00:07:56.494 [2024-07-13 10:39:12.745903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.494 [2024-07-13 10:39:12.745931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.494 [2024-07-13 10:39:12.746055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.494 [2024-07-13 10:39:12.746074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.494 [2024-07-13 10:39:12.746188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000025 cdw11:00000000 00:07:56.494 [2024-07-13 10:39:12.746207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.494 [2024-07-13 10:39:12.746329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.494 [2024-07-13 10:39:12.746344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.494 [2024-07-13 10:39:12.746461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 00:07:56.495 [2024-07-13 10:39:12.746477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.495 #32 NEW cov: 11687 ft: 14174 corp: 25/157b lim: 10 exec/s: 32 rss: 69Mb L: 10/10 MS: 1 ChangeByte- 00:07:56.495 [2024-07-13 10:39:12.805599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:56.495 [2024-07-13 10:39:12.805628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.495 [2024-07-13 10:39:12.805754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002b2b cdw11:00000000 00:07:56.495 [2024-07-13 10:39:12.805773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.495 [2024-07-13 10:39:12.805893] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000242b cdw11:00000000 00:07:56.495 [2024-07-13 10:39:12.805911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.495 #33 NEW cov: 11687 ft: 14192 corp: 26/164b lim: 10 exec/s: 33 rss: 70Mb L: 7/10 MS: 1 EraseBytes- 00:07:56.495 [2024-07-13 10:39:12.865609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002eb9 cdw11:00000000 00:07:56.495 [2024-07-13 10:39:12.865637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.495 [2024-07-13 10:39:12.865776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000930a cdw11:00000000 00:07:56.495 [2024-07-13 10:39:12.865796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.754 #34 NEW cov: 11687 ft: 14231 corp: 27/168b lim: 10 exec/s: 34 rss: 70Mb L: 4/10 MS: 1 CrossOver- 00:07:56.754 [2024-07-13 10:39:12.916100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.754 [2024-07-13 10:39:12.916129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.754 [2024-07-13 10:39:12.916263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a3d cdw11:00000000 00:07:56.754 [2024-07-13 10:39:12.916279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.754 [2024-07-13 10:39:12.916396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00002b2b cdw11:00000000 00:07:56.754 [2024-07-13 10:39:12.916413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.754 [2024-07-13 10:39:12.916544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00002b2b cdw11:00000000 00:07:56.754 [2024-07-13 10:39:12.916561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.754 #35 NEW cov: 11687 ft: 14256 corp: 28/177b lim: 10 exec/s: 35 rss: 70Mb L: 9/10 MS: 1 ChangeByte- 00:07:56.754 [2024-07-13 10:39:12.975720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009393 cdw11:00000000 00:07:56.754 [2024-07-13 10:39:12.975748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.754 #36 NEW cov: 11687 ft: 14296 corp: 29/180b lim: 10 exec/s: 36 rss: 70Mb L: 3/10 MS: 1 EraseBytes- 00:07:56.754 [2024-07-13 10:39:13.026750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:56.754 [2024-07-13 10:39:13.026777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.754 [2024-07-13 10:39:13.026907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 
00:07:56.754 [2024-07-13 10:39:13.026925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.754 [2024-07-13 10:39:13.027038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.754 [2024-07-13 10:39:13.027057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.754 [2024-07-13 10:39:13.027168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.754 [2024-07-13 10:39:13.027185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.754 [2024-07-13 10:39:13.027301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 00:07:56.754 [2024-07-13 10:39:13.027318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.754 #37 NEW cov: 11687 ft: 14319 corp: 30/190b lim: 10 exec/s: 37 rss: 70Mb L: 10/10 MS: 1 ChangeBinInt- 00:07:56.754 [2024-07-13 10:39:13.076668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:56.754 [2024-07-13 10:39:13.076696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.754 [2024-07-13 10:39:13.076810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:56.754 [2024-07-13 10:39:13.076829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.754 [2024-07-13 10:39:13.076957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00002b32 cdw11:00000000 00:07:56.754 [2024-07-13 10:39:13.076973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.754 [2024-07-13 10:39:13.077104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00002b2b cdw11:00000000 00:07:56.754 [2024-07-13 10:39:13.077122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.754 #38 NEW cov: 11687 ft: 14338 corp: 31/199b lim: 10 exec/s: 38 rss: 70Mb L: 9/10 MS: 1 ChangeByte- 00:07:56.754 [2024-07-13 10:39:13.126407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000b9ff cdw11:00000000 00:07:56.754 [2024-07-13 10:39:13.126434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.754 [2024-07-13 10:39:13.126558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00008eff cdw11:00000000 00:07:56.754 [2024-07-13 10:39:13.126574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.013 #39 NEW cov: 11687 ft: 14420 corp: 32/203b lim: 10 exec/s: 39 rss: 70Mb L: 4/10 MS: 1 ChangeByte- 00:07:57.013 [2024-07-13 10:39:13.176440] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:57.013 [2024-07-13 10:39:13.176488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.013 [2024-07-13 10:39:13.176620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000abab cdw11:00000000 00:07:57.013 [2024-07-13 10:39:13.176636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.013 #40 NEW cov: 11687 ft: 14421 corp: 33/207b lim: 10 exec/s: 40 rss: 70Mb L: 4/10 MS: 1 CopyPart- 00:07:57.013 [2024-07-13 10:39:13.226773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002e05 cdw11:00000000 00:07:57.013 [2024-07-13 10:39:13.226802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.013 [2024-07-13 10:39:13.226921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000b993 cdw11:00000000 00:07:57.013 [2024-07-13 10:39:13.226939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.013 #41 NEW cov: 11687 ft: 14429 corp: 34/212b lim: 10 exec/s: 20 rss: 70Mb L: 5/10 MS: 1 InsertByte- 00:07:57.013 #41 DONE cov: 11687 ft: 14429 corp: 34/212b lim: 10 exec/s: 20 rss: 70Mb 00:07:57.013 ###### Recommended dictionary. ###### 00:07:57.013 "\000\000" # Uses: 0 00:07:57.013 ###### End of recommended dictionary. ###### 00:07:57.013 Done 41 runs in 2 second(s) 00:07:57.013 10:39:13 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_6.conf 00:07:57.013 10:39:13 -- ../common.sh@72 -- # (( i++ )) 00:07:57.013 10:39:13 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:57.013 10:39:13 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:07:57.013 10:39:13 -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:07:57.013 10:39:13 -- nvmf/run.sh@24 -- # local timen=1 00:07:57.013 10:39:13 -- nvmf/run.sh@25 -- # local core=0x1 00:07:57.013 10:39:13 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:57.013 10:39:13 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:07:57.013 10:39:13 -- nvmf/run.sh@29 -- # printf %02d 7 00:07:57.013 10:39:13 -- nvmf/run.sh@29 -- # port=4407 00:07:57.013 10:39:13 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:57.013 10:39:13 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:07:57.013 10:39:13 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:57.013 10:39:13 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 -r /var/tmp/spdk7.sock 00:07:57.271 [2024-07-13 10:39:13.408223] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 
initialization... 00:07:57.271 [2024-07-13 10:39:13.408293] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1992476 ] 00:07:57.271 EAL: No free 2048 kB hugepages reported on node 1 00:07:57.271 [2024-07-13 10:39:13.582217] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:57.271 [2024-07-13 10:39:13.601978] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:57.271 [2024-07-13 10:39:13.602122] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:57.271 [2024-07-13 10:39:13.653898] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:57.530 [2024-07-13 10:39:13.670166] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:07:57.530 INFO: Running with entropic power schedule (0xFF, 100). 00:07:57.530 INFO: Seed: 2813359954 00:07:57.530 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:07:57.530 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:07:57.530 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:57.530 INFO: A corpus is not provided, starting from an empty corpus 00:07:57.530 #2 INITED exec/s: 0 rss: 60Mb 00:07:57.530 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:57.530 This may also happen if the target rejected all inputs we tried so far 00:07:57.530 [2024-07-13 10:39:13.715400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:57.530 [2024-07-13 10:39:13.715428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.788 NEW_FUNC[1/669]: 0x4a9b70 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:07:57.788 NEW_FUNC[2/669]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:57.788 #3 NEW cov: 11460 ft: 11426 corp: 2/3b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 CrossOver- 00:07:57.788 [2024-07-13 10:39:14.016492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.788 [2024-07-13 10:39:14.016523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.788 [2024-07-13 10:39:14.016577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.788 [2024-07-13 10:39:14.016590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.788 [2024-07-13 10:39:14.016640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:57.788 [2024-07-13 10:39:14.016654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.788 [2024-07-13 10:39:14.016705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO 
SQ (00) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 00:07:57.788 [2024-07-13 10:39:14.016721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.788 #4 NEW cov: 11573 ft: 12136 corp: 3/12b lim: 10 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:07:57.788 [2024-07-13 10:39:14.066145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a3b cdw11:00000000 00:07:57.788 [2024-07-13 10:39:14.066172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.788 #5 NEW cov: 11579 ft: 12520 corp: 4/14b lim: 10 exec/s: 0 rss: 68Mb L: 2/9 MS: 1 ChangeByte- 00:07:57.788 [2024-07-13 10:39:14.106303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a01 cdw11:00000000 00:07:57.788 [2024-07-13 10:39:14.106329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.788 #6 NEW cov: 11664 ft: 12780 corp: 5/16b lim: 10 exec/s: 0 rss: 68Mb L: 2/9 MS: 1 ChangeBinInt- 00:07:57.788 [2024-07-13 10:39:14.146371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a01 cdw11:00000000 00:07:57.788 [2024-07-13 10:39:14.146396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.788 #7 NEW cov: 11664 ft: 12841 corp: 6/18b lim: 10 exec/s: 0 rss: 68Mb L: 2/9 MS: 1 ShuffleBytes- 00:07:58.047 [2024-07-13 10:39:14.186480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aa3 cdw11:00000000 00:07:58.047 [2024-07-13 10:39:14.186506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.047 #8 NEW cov: 11664 ft: 12967 corp: 7/20b lim: 10 exec/s: 0 rss: 68Mb L: 2/9 MS: 1 ChangeByte- 00:07:58.047 [2024-07-13 10:39:14.226649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000020a cdw11:00000000 00:07:58.047 [2024-07-13 10:39:14.226674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.047 #9 NEW cov: 11664 ft: 13006 corp: 8/22b lim: 10 exec/s: 0 rss: 68Mb L: 2/9 MS: 1 ChangeBit- 00:07:58.047 [2024-07-13 10:39:14.256946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000aeae cdw11:00000000 00:07:58.047 [2024-07-13 10:39:14.256971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.047 [2024-07-13 10:39:14.257023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000aeae cdw11:00000000 00:07:58.047 [2024-07-13 10:39:14.257036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.047 [2024-07-13 10:39:14.257087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ae02 cdw11:00000000 00:07:58.047 [2024-07-13 10:39:14.257101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.047 #10 NEW 
cov: 11664 ft: 13199 corp: 9/29b lim: 10 exec/s: 0 rss: 68Mb L: 7/9 MS: 1 InsertRepeatedBytes- 00:07:58.048 [2024-07-13 10:39:14.296809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000dbdb cdw11:00000000 00:07:58.048 [2024-07-13 10:39:14.296834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.048 #15 NEW cov: 11664 ft: 13307 corp: 10/31b lim: 10 exec/s: 0 rss: 68Mb L: 2/9 MS: 5 ChangeBit-CrossOver-ChangeByte-ShuffleBytes-CopyPart- 00:07:58.048 [2024-07-13 10:39:14.336943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aa3 cdw11:00000000 00:07:58.048 [2024-07-13 10:39:14.336969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.048 #16 NEW cov: 11664 ft: 13395 corp: 11/34b lim: 10 exec/s: 0 rss: 69Mb L: 3/9 MS: 1 InsertByte- 00:07:58.048 [2024-07-13 10:39:14.377097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:58.048 [2024-07-13 10:39:14.377122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.048 #17 NEW cov: 11664 ft: 13418 corp: 12/36b lim: 10 exec/s: 0 rss: 69Mb L: 2/9 MS: 1 CrossOver- 00:07:58.048 [2024-07-13 10:39:14.407626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.048 [2024-07-13 10:39:14.407651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.048 [2024-07-13 10:39:14.407701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.048 [2024-07-13 10:39:14.407715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.048 [2024-07-13 10:39:14.407763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00003100 cdw11:00000000 00:07:58.048 [2024-07-13 10:39:14.407777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.048 [2024-07-13 10:39:14.407827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.048 [2024-07-13 10:39:14.407840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.048 [2024-07-13 10:39:14.407891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:58.048 [2024-07-13 10:39:14.407904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.048 #18 NEW cov: 11664 ft: 13504 corp: 13/46b lim: 10 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 InsertByte- 00:07:58.307 [2024-07-13 10:39:14.447318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:58.307 [2024-07-13 10:39:14.447343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:07:58.307 #19 NEW cov: 11664 ft: 13572 corp: 14/48b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:58.307 [2024-07-13 10:39:14.487408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:58.307 [2024-07-13 10:39:14.487433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.307 #20 NEW cov: 11664 ft: 13637 corp: 15/50b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 1 CopyPart- 00:07:58.307 [2024-07-13 10:39:14.517754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 00:07:58.307 [2024-07-13 10:39:14.517778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.307 [2024-07-13 10:39:14.517831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.307 [2024-07-13 10:39:14.517845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.307 [2024-07-13 10:39:14.517894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.307 [2024-07-13 10:39:14.517908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.307 #21 NEW cov: 11664 ft: 13679 corp: 16/57b lim: 10 exec/s: 0 rss: 69Mb L: 7/10 MS: 1 InsertRepeatedBytes- 00:07:58.307 [2024-07-13 10:39:14.557848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a6ae cdw11:00000000 00:07:58.307 [2024-07-13 10:39:14.557876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.307 [2024-07-13 10:39:14.557929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000aeae cdw11:00000000 00:07:58.307 [2024-07-13 10:39:14.557943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.307 [2024-07-13 10:39:14.557992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ae02 cdw11:00000000 00:07:58.307 [2024-07-13 10:39:14.558022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.307 #22 NEW cov: 11664 ft: 13701 corp: 17/64b lim: 10 exec/s: 0 rss: 69Mb L: 7/10 MS: 1 ChangeBit- 00:07:58.307 [2024-07-13 10:39:14.597790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000031ff cdw11:00000000 00:07:58.307 [2024-07-13 10:39:14.597814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.307 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:58.307 #26 NEW cov: 11687 ft: 13750 corp: 18/66b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 4 EraseBytes-ChangeBinInt-ChangeByte-InsertByte- 00:07:58.307 [2024-07-13 10:39:14.637848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000010a 
cdw11:00000000 00:07:58.307 [2024-07-13 10:39:14.637873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.307 #27 NEW cov: 11687 ft: 13764 corp: 19/68b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 1 ChangeByte- 00:07:58.307 [2024-07-13 10:39:14.667951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008a31 cdw11:00000000 00:07:58.307 [2024-07-13 10:39:14.667976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.307 #29 NEW cov: 11687 ft: 13843 corp: 20/70b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 2 ChangeBit-InsertByte- 00:07:58.568 [2024-07-13 10:39:14.698503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:58.568 [2024-07-13 10:39:14.698529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.568 [2024-07-13 10:39:14.698583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000a531 cdw11:00000000 00:07:58.568 [2024-07-13 10:39:14.698596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.568 [2024-07-13 10:39:14.698647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000faad cdw11:00000000 00:07:58.568 [2024-07-13 10:39:14.698660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.568 [2024-07-13 10:39:14.698710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000b168 cdw11:00000000 00:07:58.568 [2024-07-13 10:39:14.698723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.568 [2024-07-13 10:39:14.698773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00002900 cdw11:00000000 00:07:58.568 [2024-07-13 10:39:14.698786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.568 #30 NEW cov: 11687 ft: 13871 corp: 21/80b lim: 10 exec/s: 30 rss: 69Mb L: 10/10 MS: 1 CMP- DE: "\2451\372\255\261h)\000"- 00:07:58.568 [2024-07-13 10:39:14.738496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a6ae cdw11:00000000 00:07:58.568 [2024-07-13 10:39:14.738521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.568 [2024-07-13 10:39:14.738576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffae cdw11:00000000 00:07:58.568 [2024-07-13 10:39:14.738589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.568 [2024-07-13 10:39:14.738641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000aeae cdw11:00000000 00:07:58.568 [2024-07-13 10:39:14.738654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.568 
[2024-07-13 10:39:14.738703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000020a cdw11:00000000 00:07:58.568 [2024-07-13 10:39:14.738716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.568 #31 NEW cov: 11687 ft: 13899 corp: 22/88b lim: 10 exec/s: 31 rss: 70Mb L: 8/10 MS: 1 InsertByte- 00:07:58.568 [2024-07-13 10:39:14.778283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000dbdb cdw11:00000000 00:07:58.568 [2024-07-13 10:39:14.778308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.568 #32 NEW cov: 11687 ft: 13914 corp: 23/91b lim: 10 exec/s: 32 rss: 70Mb L: 3/10 MS: 1 CopyPart- 00:07:58.568 [2024-07-13 10:39:14.818721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a6ae cdw11:00000000 00:07:58.568 [2024-07-13 10:39:14.818745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.568 [2024-07-13 10:39:14.818796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffae cdw11:00000000 00:07:58.568 [2024-07-13 10:39:14.818810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.568 [2024-07-13 10:39:14.818860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000aeae cdw11:00000000 00:07:58.568 [2024-07-13 10:39:14.818890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.568 [2024-07-13 10:39:14.818939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000020a cdw11:00000000 00:07:58.568 [2024-07-13 10:39:14.818952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.568 #33 NEW cov: 11687 ft: 13917 corp: 24/100b lim: 10 exec/s: 33 rss: 70Mb L: 9/10 MS: 1 InsertByte- 00:07:58.568 [2024-07-13 10:39:14.858851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a531 cdw11:00000000 00:07:58.568 [2024-07-13 10:39:14.858876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.568 [2024-07-13 10:39:14.858928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000faad cdw11:00000000 00:07:58.568 [2024-07-13 10:39:14.858941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.568 [2024-07-13 10:39:14.858991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000b168 cdw11:00000000 00:07:58.568 [2024-07-13 10:39:14.859003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.568 [2024-07-13 10:39:14.859054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00002900 cdw11:00000000 00:07:58.568 [2024-07-13 10:39:14.859066] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.568 #35 NEW cov: 11687 ft: 13930 corp: 25/109b lim: 10 exec/s: 35 rss: 70Mb L: 9/10 MS: 2 EraseBytes-PersAutoDict- DE: "\2451\372\255\261h)\000"- 00:07:58.568 [2024-07-13 10:39:14.899085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aa5 cdw11:00000000 00:07:58.568 [2024-07-13 10:39:14.899110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.568 [2024-07-13 10:39:14.899163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000031fa cdw11:00000000 00:07:58.568 [2024-07-13 10:39:14.899176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.568 [2024-07-13 10:39:14.899228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000adb1 cdw11:00000000 00:07:58.568 [2024-07-13 10:39:14.899241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.568 [2024-07-13 10:39:14.899290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00006829 cdw11:00000000 00:07:58.568 [2024-07-13 10:39:14.899304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.568 [2024-07-13 10:39:14.899355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 00:07:58.568 [2024-07-13 10:39:14.899368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.568 #36 NEW cov: 11687 ft: 13936 corp: 26/119b lim: 10 exec/s: 36 rss: 70Mb L: 10/10 MS: 1 PersAutoDict- DE: "\2451\372\255\261h)\000"- 00:07:58.568 [2024-07-13 10:39:14.938711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a01 cdw11:00000000 00:07:58.568 [2024-07-13 10:39:14.938735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.828 #37 NEW cov: 11687 ft: 14042 corp: 27/122b lim: 10 exec/s: 37 rss: 70Mb L: 3/10 MS: 1 CrossOver- 00:07:58.828 [2024-07-13 10:39:14.969023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 00:07:58.828 [2024-07-13 10:39:14.969049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.828 [2024-07-13 10:39:14.969117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000040 cdw11:00000000 00:07:58.828 [2024-07-13 10:39:14.969131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.828 [2024-07-13 10:39:14.969185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.828 [2024-07-13 10:39:14.969199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.828 #38 NEW cov: 11687 ft: 
14056 corp: 28/129b lim: 10 exec/s: 38 rss: 70Mb L: 7/10 MS: 1 ChangeBit- 00:07:58.828 [2024-07-13 10:39:15.008926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000db7a cdw11:00000000 00:07:58.828 [2024-07-13 10:39:15.008950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.828 #39 NEW cov: 11687 ft: 14147 corp: 29/132b lim: 10 exec/s: 39 rss: 70Mb L: 3/10 MS: 1 InsertByte- 00:07:58.828 [2024-07-13 10:39:15.049383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a29 cdw11:00000000 00:07:58.828 [2024-07-13 10:39:15.049408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.828 [2024-07-13 10:39:15.049463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002929 cdw11:00000000 00:07:58.828 [2024-07-13 10:39:15.049480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.828 [2024-07-13 10:39:15.049531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00002929 cdw11:00000000 00:07:58.829 [2024-07-13 10:39:15.049544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.829 [2024-07-13 10:39:15.049595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000290a cdw11:00000000 00:07:58.829 [2024-07-13 10:39:15.049608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.829 #40 NEW cov: 11687 ft: 14180 corp: 30/140b lim: 10 exec/s: 40 rss: 70Mb L: 8/10 MS: 1 InsertRepeatedBytes- 00:07:58.829 [2024-07-13 10:39:15.089372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 00:07:58.829 [2024-07-13 10:39:15.089397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.829 [2024-07-13 10:39:15.089454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.829 [2024-07-13 10:39:15.089468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.829 [2024-07-13 10:39:15.089536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.829 [2024-07-13 10:39:15.089550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.829 #41 NEW cov: 11687 ft: 14204 corp: 31/147b lim: 10 exec/s: 41 rss: 70Mb L: 7/10 MS: 1 ChangeBinInt- 00:07:58.829 [2024-07-13 10:39:15.129471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 00:07:58.829 [2024-07-13 10:39:15.129495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.829 [2024-07-13 10:39:15.129547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 
cdw10:00000000 cdw11:00000000 00:07:58.829 [2024-07-13 10:39:15.129561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.829 [2024-07-13 10:39:15.129611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.829 [2024-07-13 10:39:15.129625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.829 #42 NEW cov: 11687 ft: 14239 corp: 32/154b lim: 10 exec/s: 42 rss: 70Mb L: 7/10 MS: 1 ShuffleBytes- 00:07:58.829 [2024-07-13 10:39:15.169517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a97 cdw11:00000000 00:07:58.829 [2024-07-13 10:39:15.169541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.829 [2024-07-13 10:39:15.169593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000010a cdw11:00000000 00:07:58.829 [2024-07-13 10:39:15.169607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.829 #43 NEW cov: 11687 ft: 14395 corp: 33/158b lim: 10 exec/s: 43 rss: 70Mb L: 4/10 MS: 1 InsertByte- 00:07:58.829 [2024-07-13 10:39:15.209853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.829 [2024-07-13 10:39:15.209879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.829 [2024-07-13 10:39:15.209932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.829 [2024-07-13 10:39:15.209950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.829 [2024-07-13 10:39:15.210001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000040 cdw11:00000000 00:07:58.829 [2024-07-13 10:39:15.210015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.829 [2024-07-13 10:39:15.210065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 00:07:58.829 [2024-07-13 10:39:15.210083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.088 #44 NEW cov: 11687 ft: 14412 corp: 34/167b lim: 10 exec/s: 44 rss: 70Mb L: 9/10 MS: 1 ChangeBit- 00:07:59.088 [2024-07-13 10:39:15.249950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000001ff cdw11:00000000 00:07:59.088 [2024-07-13 10:39:15.249976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.088 [2024-07-13 10:39:15.250029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:59.088 [2024-07-13 10:39:15.250043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.088 [2024-07-13 
10:39:15.250096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:59.088 [2024-07-13 10:39:15.250110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.088 [2024-07-13 10:39:15.250161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:59.088 [2024-07-13 10:39:15.250175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.088 #46 NEW cov: 11687 ft: 14425 corp: 35/175b lim: 10 exec/s: 46 rss: 70Mb L: 8/10 MS: 2 EraseBytes-InsertRepeatedBytes- 00:07:59.088 [2024-07-13 10:39:15.290052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 00:07:59.088 [2024-07-13 10:39:15.290078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.088 [2024-07-13 10:39:15.290131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:59.088 [2024-07-13 10:39:15.290144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.088 [2024-07-13 10:39:15.290198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000028 cdw11:00000000 00:07:59.088 [2024-07-13 10:39:15.290211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.088 [2024-07-13 10:39:15.290264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:000000f6 cdw11:00000000 00:07:59.088 [2024-07-13 10:39:15.290278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.088 #47 NEW cov: 11687 ft: 14437 corp: 36/183b lim: 10 exec/s: 47 rss: 70Mb L: 8/10 MS: 1 InsertByte- 00:07:59.088 [2024-07-13 10:39:15.329806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000027a3 cdw11:00000000 00:07:59.088 [2024-07-13 10:39:15.329832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.088 #48 NEW cov: 11687 ft: 14443 corp: 37/186b lim: 10 exec/s: 48 rss: 70Mb L: 3/10 MS: 1 ChangeByte- 00:07:59.088 [2024-07-13 10:39:15.370302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:59.089 [2024-07-13 10:39:15.370331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.089 [2024-07-13 10:39:15.370384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:59.089 [2024-07-13 10:39:15.370397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.089 [2024-07-13 10:39:15.370468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:59.089 [2024-07-13 10:39:15.370482] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.089 [2024-07-13 10:39:15.370531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000020a cdw11:00000000 00:07:59.089 [2024-07-13 10:39:15.370545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.089 #49 NEW cov: 11687 ft: 14447 corp: 38/194b lim: 10 exec/s: 49 rss: 70Mb L: 8/10 MS: 1 InsertRepeatedBytes- 00:07:59.089 [2024-07-13 10:39:15.410091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a01 cdw11:00000000 00:07:59.089 [2024-07-13 10:39:15.410116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.089 #50 NEW cov: 11687 ft: 14454 corp: 39/197b lim: 10 exec/s: 50 rss: 70Mb L: 3/10 MS: 1 ChangeByte- 00:07:59.089 [2024-07-13 10:39:15.450566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a6ae cdw11:00000000 00:07:59.089 [2024-07-13 10:39:15.450591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.089 [2024-07-13 10:39:15.450644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffae cdw11:00000000 00:07:59.089 [2024-07-13 10:39:15.450658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.089 [2024-07-13 10:39:15.450707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000aeaa cdw11:00000000 00:07:59.089 [2024-07-13 10:39:15.450721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.089 [2024-07-13 10:39:15.450773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000020a cdw11:00000000 00:07:59.089 [2024-07-13 10:39:15.450786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.089 #51 NEW cov: 11687 ft: 14466 corp: 40/205b lim: 10 exec/s: 51 rss: 70Mb L: 8/10 MS: 1 ChangeBit- 00:07:59.349 [2024-07-13 10:39:15.490692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002d02 cdw11:00000000 00:07:59.349 [2024-07-13 10:39:15.490718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.349 [2024-07-13 10:39:15.490770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:59.349 [2024-07-13 10:39:15.490784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.350 [2024-07-13 10:39:15.490834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00004000 cdw11:00000000 00:07:59.350 [2024-07-13 10:39:15.490846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.350 [2024-07-13 10:39:15.490896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE 
IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 00:07:59.350 [2024-07-13 10:39:15.490912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.350 #52 NEW cov: 11687 ft: 14476 corp: 41/213b lim: 10 exec/s: 52 rss: 70Mb L: 8/10 MS: 1 InsertByte- 00:07:59.350 [2024-07-13 10:39:15.530432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000240a cdw11:00000000 00:07:59.350 [2024-07-13 10:39:15.530463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.350 #53 NEW cov: 11687 ft: 14489 corp: 42/215b lim: 10 exec/s: 53 rss: 70Mb L: 2/10 MS: 1 ChangeByte- 00:07:59.350 [2024-07-13 10:39:15.570584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 00:07:59.350 [2024-07-13 10:39:15.570609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.350 #54 NEW cov: 11687 ft: 14501 corp: 43/217b lim: 10 exec/s: 54 rss: 70Mb L: 2/10 MS: 1 ChangeBinInt- 00:07:59.350 [2024-07-13 10:39:15.611131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a531 cdw11:00000000 00:07:59.350 [2024-07-13 10:39:15.611156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.350 [2024-07-13 10:39:15.611207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000faad cdw11:00000000 00:07:59.350 [2024-07-13 10:39:15.611221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.350 [2024-07-13 10:39:15.611271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000b168 cdw11:00000000 00:07:59.350 [2024-07-13 10:39:15.611284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.350 [2024-07-13 10:39:15.611334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00002900 cdw11:00000000 00:07:59.350 [2024-07-13 10:39:15.611348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.350 [2024-07-13 10:39:15.611400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000240a cdw11:00000000 00:07:59.350 [2024-07-13 10:39:15.611412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.350 #55 NEW cov: 11687 ft: 14540 corp: 44/227b lim: 10 exec/s: 55 rss: 70Mb L: 10/10 MS: 1 PersAutoDict- DE: "\2451\372\255\261h)\000"- 00:07:59.350 [2024-07-13 10:39:15.650789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a01 cdw11:00000000 00:07:59.350 [2024-07-13 10:39:15.650814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.350 #56 NEW cov: 11687 ft: 14586 corp: 45/230b lim: 10 exec/s: 56 rss: 70Mb L: 3/10 MS: 1 CopyPart- 00:07:59.350 [2024-07-13 10:39:15.680886] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a500 cdw11:00000000 00:07:59.350 [2024-07-13 10:39:15.680911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.350 #58 NEW cov: 11687 ft: 14598 corp: 46/232b lim: 10 exec/s: 29 rss: 70Mb L: 2/10 MS: 2 EraseBytes-InsertByte- 00:07:59.350 #58 DONE cov: 11687 ft: 14598 corp: 46/232b lim: 10 exec/s: 29 rss: 70Mb 00:07:59.350 ###### Recommended dictionary. ###### 00:07:59.350 "\2451\372\255\261h)\000" # Uses: 3 00:07:59.350 ###### End of recommended dictionary. ###### 00:07:59.350 Done 58 runs in 2 second(s) 00:07:59.608 10:39:15 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_7.conf 00:07:59.608 10:39:15 -- ../common.sh@72 -- # (( i++ )) 00:07:59.608 10:39:15 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:59.608 10:39:15 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:07:59.608 10:39:15 -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:07:59.608 10:39:15 -- nvmf/run.sh@24 -- # local timen=1 00:07:59.608 10:39:15 -- nvmf/run.sh@25 -- # local core=0x1 00:07:59.608 10:39:15 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:59.608 10:39:15 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:07:59.608 10:39:15 -- nvmf/run.sh@29 -- # printf %02d 8 00:07:59.608 10:39:15 -- nvmf/run.sh@29 -- # port=4408 00:07:59.608 10:39:15 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:59.608 10:39:15 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:07:59.608 10:39:15 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:59.608 10:39:15 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 -r /var/tmp/spdk8.sock 00:07:59.608 [2024-07-13 10:39:15.859294] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:07:59.608 [2024-07-13 10:39:15.859372] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1993019 ] 00:07:59.608 EAL: No free 2048 kB hugepages reported on node 1 00:07:59.867 [2024-07-13 10:39:16.036051] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:59.867 [2024-07-13 10:39:16.055558] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:59.867 [2024-07-13 10:39:16.055680] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:59.867 [2024-07-13 10:39:16.107057] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:59.867 [2024-07-13 10:39:16.123357] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:07:59.867 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:59.867 INFO: Seed: 971408927 00:07:59.867 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:07:59.867 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:07:59.867 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:59.867 INFO: A corpus is not provided, starting from an empty corpus 00:07:59.867 [2024-07-13 10:39:16.168582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.867 [2024-07-13 10:39:16.168611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.867 #2 INITED cov: 11479 ft: 11475 corp: 1/1b exec/s: 0 rss: 66Mb 00:07:59.867 [2024-07-13 10:39:16.199184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.867 [2024-07-13 10:39:16.199208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.867 [2024-07-13 10:39:16.199264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.867 [2024-07-13 10:39:16.199278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.867 [2024-07-13 10:39:16.199331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.867 [2024-07-13 10:39:16.199344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.867 [2024-07-13 10:39:16.199396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.867 [2024-07-13 10:39:16.199412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.867 [2024-07-13 10:39:16.199467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.867 [2024-07-13 10:39:16.199480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:00.126 NEW_FUNC[1/1]: 0x1730f60 in nvme_get_transport /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_transport.c:55 00:08:00.126 #3 NEW cov: 11601 ft: 12734 corp: 2/6b lim: 5 exec/s: 0 rss: 68Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:08:00.385 [2024-07-13 10:39:16.521203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.385 [2024-07-13 10:39:16.521253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.385 [2024-07-13 10:39:16.521407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.385 [2024-07-13 10:39:16.521429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.385 [2024-07-13 10:39:16.521569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.385 [2024-07-13 10:39:16.521590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.385 [2024-07-13 10:39:16.521731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.385 [2024-07-13 10:39:16.521754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.385 #4 NEW cov: 11607 ft: 13285 corp: 3/10b lim: 5 exec/s: 0 rss: 68Mb L: 4/5 MS: 1 EraseBytes- 00:08:00.385 [2024-07-13 10:39:16.580448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.385 [2024-07-13 10:39:16.580480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.385 #5 NEW cov: 11692 ft: 13678 corp: 4/11b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 CopyPart- 00:08:00.385 [2024-07-13 10:39:16.630546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.385 [2024-07-13 10:39:16.630574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.385 #6 NEW cov: 11692 ft: 13809 corp: 5/12b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 ChangeByte- 00:08:00.385 [2024-07-13 10:39:16.680674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.385 [2024-07-13 10:39:16.680706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.385 #7 NEW cov: 11692 ft: 13866 corp: 6/13b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 CopyPart- 00:08:00.385 [2024-07-13 10:39:16.731181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.385 [2024-07-13 10:39:16.731209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.385 [2024-07-13 10:39:16.731341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.385 [2024-07-13 10:39:16.731361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.385 #8 NEW cov: 11692 ft: 14132 corp: 7/15b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 InsertByte- 00:08:00.643 [2024-07-13 10:39:16.791348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:08:00.644 [2024-07-13 10:39:16.791376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.644 [2024-07-13 10:39:16.791507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.644 [2024-07-13 10:39:16.791524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.644 #9 NEW cov: 11692 ft: 14146 corp: 8/17b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 InsertByte- 00:08:00.644 [2024-07-13 10:39:16.841258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.644 [2024-07-13 10:39:16.841287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.644 #10 NEW cov: 11692 ft: 14190 corp: 9/18b lim: 5 exec/s: 0 rss: 68Mb L: 1/5 MS: 1 ChangeBit- 00:08:00.644 [2024-07-13 10:39:16.901751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.644 [2024-07-13 10:39:16.901778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.644 [2024-07-13 10:39:16.901901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.644 [2024-07-13 10:39:16.901918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.644 #11 NEW cov: 11692 ft: 14405 corp: 10/20b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 ShuffleBytes- 00:08:00.644 [2024-07-13 10:39:16.951634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.644 [2024-07-13 10:39:16.951662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.644 #12 NEW cov: 11692 ft: 14434 corp: 11/21b lim: 5 exec/s: 0 rss: 69Mb L: 1/5 MS: 1 ShuffleBytes- 00:08:00.644 [2024-07-13 10:39:16.991584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.644 [2024-07-13 10:39:16.991612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.644 #13 NEW cov: 11692 ft: 14461 corp: 12/22b lim: 5 exec/s: 0 rss: 69Mb L: 1/5 MS: 1 CrossOver- 00:08:00.902 [2024-07-13 10:39:17.041953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.902 [2024-07-13 10:39:17.041981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.902 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:00.902 #14 NEW cov: 11715 ft: 14469 corp: 13/23b 
lim: 5 exec/s: 0 rss: 69Mb L: 1/5 MS: 1 ShuffleBytes- 00:08:00.902 [2024-07-13 10:39:17.092599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.902 [2024-07-13 10:39:17.092628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.902 [2024-07-13 10:39:17.092769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.902 [2024-07-13 10:39:17.092787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.902 [2024-07-13 10:39:17.092906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.902 [2024-07-13 10:39:17.092924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.902 #15 NEW cov: 11715 ft: 14668 corp: 14/26b lim: 5 exec/s: 0 rss: 69Mb L: 3/5 MS: 1 CrossOver- 00:08:00.902 [2024-07-13 10:39:17.142027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.902 [2024-07-13 10:39:17.142055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.903 #16 NEW cov: 11715 ft: 14706 corp: 15/27b lim: 5 exec/s: 16 rss: 69Mb L: 1/5 MS: 1 ShuffleBytes- 00:08:00.903 [2024-07-13 10:39:17.182370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.903 [2024-07-13 10:39:17.182399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.903 [2024-07-13 10:39:17.182524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.903 [2024-07-13 10:39:17.182542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.903 #17 NEW cov: 11715 ft: 14759 corp: 16/29b lim: 5 exec/s: 17 rss: 69Mb L: 2/5 MS: 1 InsertByte- 00:08:00.903 [2024-07-13 10:39:17.222713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.903 [2024-07-13 10:39:17.222740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.903 [2024-07-13 10:39:17.222878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.903 [2024-07-13 10:39:17.222893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.903 #18 NEW cov: 11715 ft: 14803 corp: 17/31b lim: 5 exec/s: 18 rss: 69Mb L: 2/5 MS: 1 CrossOver- 00:08:00.903 [2024-07-13 10:39:17.272846] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.903 [2024-07-13 10:39:17.272872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.903 [2024-07-13 10:39:17.272989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.903 [2024-07-13 10:39:17.273005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.162 #19 NEW cov: 11715 ft: 14828 corp: 18/33b lim: 5 exec/s: 19 rss: 69Mb L: 2/5 MS: 1 CopyPart- 00:08:01.162 [2024-07-13 10:39:17.312589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.162 [2024-07-13 10:39:17.312615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.162 [2024-07-13 10:39:17.312749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.162 [2024-07-13 10:39:17.312777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.162 #20 NEW cov: 11715 ft: 14860 corp: 19/35b lim: 5 exec/s: 20 rss: 69Mb L: 2/5 MS: 1 CopyPart- 00:08:01.162 [2024-07-13 10:39:17.363614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.162 [2024-07-13 10:39:17.363641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.162 [2024-07-13 10:39:17.363771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.162 [2024-07-13 10:39:17.363787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.162 [2024-07-13 10:39:17.363907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.162 [2024-07-13 10:39:17.363923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.162 [2024-07-13 10:39:17.364046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.162 [2024-07-13 10:39:17.364062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.162 #21 NEW cov: 11715 ft: 14898 corp: 20/39b lim: 5 exec/s: 21 rss: 69Mb L: 4/5 MS: 1 ChangeByte- 00:08:01.162 [2024-07-13 10:39:17.413867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.162 [2024-07-13 
10:39:17.413893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.162 [2024-07-13 10:39:17.414019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.162 [2024-07-13 10:39:17.414036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.162 [2024-07-13 10:39:17.414159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.162 [2024-07-13 10:39:17.414176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.162 [2024-07-13 10:39:17.414301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.162 [2024-07-13 10:39:17.414317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.162 #22 NEW cov: 11715 ft: 14914 corp: 21/43b lim: 5 exec/s: 22 rss: 69Mb L: 4/5 MS: 1 CrossOver- 00:08:01.162 [2024-07-13 10:39:17.473483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.162 [2024-07-13 10:39:17.473508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.162 [2024-07-13 10:39:17.473634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.162 [2024-07-13 10:39:17.473652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.162 #23 NEW cov: 11715 ft: 14964 corp: 22/45b lim: 5 exec/s: 23 rss: 69Mb L: 2/5 MS: 1 ChangeByte- 00:08:01.162 [2024-07-13 10:39:17.524436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.162 [2024-07-13 10:39:17.524464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.162 [2024-07-13 10:39:17.524591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.162 [2024-07-13 10:39:17.524607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.162 [2024-07-13 10:39:17.524740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.162 [2024-07-13 10:39:17.524757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.162 [2024-07-13 10:39:17.524880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.162 [2024-07-13 10:39:17.524896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.162 [2024-07-13 10:39:17.525029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.162 [2024-07-13 10:39:17.525045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.162 #24 NEW cov: 11715 ft: 14984 corp: 23/50b lim: 5 exec/s: 24 rss: 69Mb L: 5/5 MS: 1 ChangeBit- 00:08:01.421 [2024-07-13 10:39:17.573479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.421 [2024-07-13 10:39:17.573505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.421 #25 NEW cov: 11715 ft: 14993 corp: 24/51b lim: 5 exec/s: 25 rss: 69Mb L: 1/5 MS: 1 CrossOver- 00:08:01.421 [2024-07-13 10:39:17.623636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.421 [2024-07-13 10:39:17.623664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.421 #26 NEW cov: 11715 ft: 15002 corp: 25/52b lim: 5 exec/s: 26 rss: 70Mb L: 1/5 MS: 1 ChangeBit- 00:08:01.421 [2024-07-13 10:39:17.674081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.421 [2024-07-13 10:39:17.674107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.421 [2024-07-13 10:39:17.674228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.422 [2024-07-13 10:39:17.674244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.422 #27 NEW cov: 11715 ft: 15040 corp: 26/54b lim: 5 exec/s: 27 rss: 70Mb L: 2/5 MS: 1 CrossOver- 00:08:01.422 [2024-07-13 10:39:17.723931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.422 [2024-07-13 10:39:17.723958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.422 #28 NEW cov: 11715 ft: 15047 corp: 27/55b lim: 5 exec/s: 28 rss: 70Mb L: 1/5 MS: 1 CopyPart- 00:08:01.422 [2024-07-13 10:39:17.774452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.422 [2024-07-13 10:39:17.774478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.422 [2024-07-13 10:39:17.774609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.422 [2024-07-13 10:39:17.774624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.422 #29 NEW cov: 11715 ft: 15079 corp: 28/57b lim: 5 exec/s: 29 rss: 70Mb L: 2/5 MS: 1 InsertByte- 00:08:01.706 [2024-07-13 10:39:17.824836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.706 [2024-07-13 10:39:17.824863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.706 [2024-07-13 10:39:17.824996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.706 [2024-07-13 10:39:17.825014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.706 [2024-07-13 10:39:17.825138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.707 [2024-07-13 10:39:17.825154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.707 #30 NEW cov: 11715 ft: 15106 corp: 29/60b lim: 5 exec/s: 30 rss: 70Mb L: 3/5 MS: 1 CrossOver- 00:08:01.707 [2024-07-13 10:39:17.874519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.707 [2024-07-13 10:39:17.874546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.707 #31 NEW cov: 11715 ft: 15119 corp: 30/61b lim: 5 exec/s: 31 rss: 70Mb L: 1/5 MS: 1 EraseBytes- 00:08:01.707 [2024-07-13 10:39:17.925749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.707 [2024-07-13 10:39:17.925776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.707 [2024-07-13 10:39:17.925914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.707 [2024-07-13 10:39:17.925932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.707 [2024-07-13 10:39:17.926063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.707 [2024-07-13 10:39:17.926080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.707 [2024-07-13 10:39:17.926209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.707 [2024-07-13 10:39:17.926228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 
m:0 dnr:0 00:08:01.707 [2024-07-13 10:39:17.926361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.707 [2024-07-13 10:39:17.926381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.707 #32 NEW cov: 11715 ft: 15131 corp: 31/66b lim: 5 exec/s: 32 rss: 70Mb L: 5/5 MS: 1 ShuffleBytes- 00:08:01.707 [2024-07-13 10:39:17.974932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.707 [2024-07-13 10:39:17.974958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.707 [2024-07-13 10:39:17.975092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.707 [2024-07-13 10:39:17.975108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.707 #33 NEW cov: 11715 ft: 15134 corp: 32/68b lim: 5 exec/s: 33 rss: 70Mb L: 2/5 MS: 1 InsertByte- 00:08:01.707 [2024-07-13 10:39:18.024498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.707 [2024-07-13 10:39:18.024526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.707 #34 NEW cov: 11715 ft: 15141 corp: 33/69b lim: 5 exec/s: 34 rss: 70Mb L: 1/5 MS: 1 ShuffleBytes- 00:08:01.707 [2024-07-13 10:39:18.075024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.707 [2024-07-13 10:39:18.075051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.966 #35 NEW cov: 11715 ft: 15168 corp: 34/70b lim: 5 exec/s: 35 rss: 70Mb L: 1/5 MS: 1 ChangeBit- 00:08:01.966 [2024-07-13 10:39:18.125232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.966 [2024-07-13 10:39:18.125260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.966 #36 NEW cov: 11715 ft: 15186 corp: 35/71b lim: 5 exec/s: 36 rss: 70Mb L: 1/5 MS: 1 ChangeBit- 00:08:01.966 [2024-07-13 10:39:18.176341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.966 [2024-07-13 10:39:18.176367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.966 [2024-07-13 10:39:18.176503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.966 [2024-07-13 10:39:18.176521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:08:01.966 [2024-07-13 10:39:18.176644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.966 [2024-07-13 10:39:18.176661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.966 [2024-07-13 10:39:18.176785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.966 [2024-07-13 10:39:18.176800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.966 [2024-07-13 10:39:18.176915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.966 [2024-07-13 10:39:18.176935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.966 #37 NEW cov: 11715 ft: 15191 corp: 36/76b lim: 5 exec/s: 18 rss: 70Mb L: 5/5 MS: 1 ShuffleBytes- 00:08:01.966 #37 DONE cov: 11715 ft: 15191 corp: 36/76b lim: 5 exec/s: 18 rss: 70Mb 00:08:01.966 Done 37 runs in 2 second(s) 00:08:01.966 10:39:18 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_8.conf 00:08:01.966 10:39:18 -- ../common.sh@72 -- # (( i++ )) 00:08:01.966 10:39:18 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:01.966 10:39:18 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:08:01.966 10:39:18 -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:08:01.966 10:39:18 -- nvmf/run.sh@24 -- # local timen=1 00:08:01.966 10:39:18 -- nvmf/run.sh@25 -- # local core=0x1 00:08:01.966 10:39:18 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:01.966 10:39:18 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:08:01.966 10:39:18 -- nvmf/run.sh@29 -- # printf %02d 9 00:08:01.966 10:39:18 -- nvmf/run.sh@29 -- # port=4409 00:08:01.966 10:39:18 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:01.966 10:39:18 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:08:01.966 10:39:18 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:01.966 10:39:18 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 -r /var/tmp/spdk9.sock 00:08:02.225 [2024-07-13 10:39:18.360947] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:08:02.225 [2024-07-13 10:39:18.361016] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1993370 ] 00:08:02.225 EAL: No free 2048 kB hugepages reported on node 1 00:08:02.225 [2024-07-13 10:39:18.541464] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:02.225 [2024-07-13 10:39:18.561692] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:02.225 [2024-07-13 10:39:18.561832] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:02.484 [2024-07-13 10:39:18.613277] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:02.484 [2024-07-13 10:39:18.629557] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:08:02.484 INFO: Running with entropic power schedule (0xFF, 100). 00:08:02.484 INFO: Seed: 3477392159 00:08:02.484 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:08:02.484 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:08:02.484 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:02.484 INFO: A corpus is not provided, starting from an empty corpus 00:08:02.484 [2024-07-13 10:39:18.695590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.484 [2024-07-13 10:39:18.695625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.484 #2 INITED cov: 11488 ft: 11489 corp: 1/1b exec/s: 0 rss: 67Mb 00:08:02.484 [2024-07-13 10:39:18.735822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.484 [2024-07-13 10:39:18.735851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.484 [2024-07-13 10:39:18.735983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.484 [2024-07-13 10:39:18.736002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.484 #3 NEW cov: 11601 ft: 12715 corp: 2/3b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 InsertByte- 00:08:02.484 [2024-07-13 10:39:18.785690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.484 [2024-07-13 10:39:18.785717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.484 #4 NEW cov: 11607 ft: 12921 corp: 3/4b lim: 5 exec/s: 0 rss: 67Mb L: 1/2 MS: 1 ChangeByte- 00:08:02.484 [2024-07-13 10:39:18.826406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.484 [2024-07-13 10:39:18.826432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.484 [2024-07-13 10:39:18.826559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.484 [2024-07-13 10:39:18.826575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.484 [2024-07-13 10:39:18.826700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.484 [2024-07-13 10:39:18.826715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.484 #5 NEW cov: 11692 ft: 13414 corp: 4/7b lim: 5 exec/s: 0 rss: 67Mb L: 3/3 MS: 1 CrossOver- 00:08:02.742 [2024-07-13 10:39:18.876235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.742 [2024-07-13 10:39:18.876264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.742 [2024-07-13 10:39:18.876390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.742 [2024-07-13 10:39:18.876408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.742 #6 NEW cov: 11692 ft: 13538 corp: 5/9b lim: 5 exec/s: 0 rss: 67Mb L: 2/3 MS: 1 ChangeBit- 00:08:02.742 [2024-07-13 10:39:18.916662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.742 [2024-07-13 10:39:18.916689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.742 [2024-07-13 10:39:18.916809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.742 [2024-07-13 10:39:18.916827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.742 [2024-07-13 10:39:18.916943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.742 [2024-07-13 10:39:18.916959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.742 #7 NEW cov: 11692 ft: 13629 corp: 6/12b lim: 5 exec/s: 0 rss: 68Mb L: 3/3 MS: 1 ChangeBit- 00:08:02.742 [2024-07-13 10:39:18.966610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.742 [2024-07-13 10:39:18.966650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.742 [2024-07-13 10:39:18.966789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:08:02.742 [2024-07-13 10:39:18.966807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.742 [2024-07-13 10:39:18.966928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.742 [2024-07-13 10:39:18.966944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.742 #8 NEW cov: 11692 ft: 13714 corp: 7/15b lim: 5 exec/s: 0 rss: 68Mb L: 3/3 MS: 1 ChangeByte- 00:08:02.742 [2024-07-13 10:39:19.016937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.742 [2024-07-13 10:39:19.016964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.742 [2024-07-13 10:39:19.017087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.742 [2024-07-13 10:39:19.017103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.742 [2024-07-13 10:39:19.017227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.742 [2024-07-13 10:39:19.017245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.742 #9 NEW cov: 11692 ft: 13747 corp: 8/18b lim: 5 exec/s: 0 rss: 68Mb L: 3/3 MS: 1 ChangeByte- 00:08:02.742 [2024-07-13 10:39:19.056899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.742 [2024-07-13 10:39:19.056927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.742 [2024-07-13 10:39:19.057039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.742 [2024-07-13 10:39:19.057057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.742 [2024-07-13 10:39:19.057164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.742 [2024-07-13 10:39:19.057179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.742 #10 NEW cov: 11692 ft: 13849 corp: 9/21b lim: 5 exec/s: 0 rss: 68Mb L: 3/3 MS: 1 ChangeBit- 00:08:02.742 [2024-07-13 10:39:19.096567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.742 [2024-07-13 10:39:19.096592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.742 #11 NEW 
cov: 11692 ft: 13931 corp: 10/22b lim: 5 exec/s: 0 rss: 68Mb L: 1/3 MS: 1 ShuffleBytes- 00:08:03.000 [2024-07-13 10:39:19.137154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.000 [2024-07-13 10:39:19.137182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.000 [2024-07-13 10:39:19.137312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.000 [2024-07-13 10:39:19.137332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.000 [2024-07-13 10:39:19.137459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.000 [2024-07-13 10:39:19.137476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.000 #12 NEW cov: 11692 ft: 14059 corp: 11/25b lim: 5 exec/s: 0 rss: 68Mb L: 3/3 MS: 1 CrossOver- 00:08:03.000 [2024-07-13 10:39:19.177411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.000 [2024-07-13 10:39:19.177438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.000 [2024-07-13 10:39:19.177570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.000 [2024-07-13 10:39:19.177587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.000 [2024-07-13 10:39:19.177710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.000 [2024-07-13 10:39:19.177727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.000 #13 NEW cov: 11692 ft: 14136 corp: 12/28b lim: 5 exec/s: 0 rss: 68Mb L: 3/3 MS: 1 ChangeBit- 00:08:03.000 [2024-07-13 10:39:19.227234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.000 [2024-07-13 10:39:19.227261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.000 [2024-07-13 10:39:19.227400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.000 [2024-07-13 10:39:19.227417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.000 #14 NEW cov: 11692 ft: 14213 corp: 13/30b lim: 5 exec/s: 0 rss: 68Mb L: 2/3 MS: 1 ChangeByte- 00:08:03.000 [2024-07-13 10:39:19.267694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT 
(0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.000 [2024-07-13 10:39:19.267722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.000 [2024-07-13 10:39:19.267849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.000 [2024-07-13 10:39:19.267866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.000 [2024-07-13 10:39:19.267989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.001 [2024-07-13 10:39:19.268007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.001 #15 NEW cov: 11692 ft: 14250 corp: 14/33b lim: 5 exec/s: 0 rss: 68Mb L: 3/3 MS: 1 ChangeBinInt- 00:08:03.001 [2024-07-13 10:39:19.318254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.001 [2024-07-13 10:39:19.318281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.001 [2024-07-13 10:39:19.318405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.001 [2024-07-13 10:39:19.318420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.001 [2024-07-13 10:39:19.318558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.001 [2024-07-13 10:39:19.318576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.001 [2024-07-13 10:39:19.318687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.001 [2024-07-13 10:39:19.318702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.001 [2024-07-13 10:39:19.318823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.001 [2024-07-13 10:39:19.318840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:03.001 #16 NEW cov: 11692 ft: 14623 corp: 15/38b lim: 5 exec/s: 0 rss: 68Mb L: 5/5 MS: 1 CrossOver- 00:08:03.001 [2024-07-13 10:39:19.357493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.001 [2024-07-13 10:39:19.357520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.001 [2024-07-13 10:39:19.357636] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.001 [2024-07-13 10:39:19.357654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.001 #17 NEW cov: 11692 ft: 14661 corp: 16/40b lim: 5 exec/s: 0 rss: 69Mb L: 2/5 MS: 1 InsertByte- 00:08:03.259 [2024-07-13 10:39:19.408610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.259 [2024-07-13 10:39:19.408635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.259 [2024-07-13 10:39:19.408747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.259 [2024-07-13 10:39:19.408765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.259 [2024-07-13 10:39:19.408876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.259 [2024-07-13 10:39:19.408891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.259 [2024-07-13 10:39:19.409009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.259 [2024-07-13 10:39:19.409026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.259 [2024-07-13 10:39:19.409142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.259 [2024-07-13 10:39:19.409158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:03.259 #18 NEW cov: 11692 ft: 14727 corp: 17/45b lim: 5 exec/s: 0 rss: 69Mb L: 5/5 MS: 1 ChangeByte- 00:08:03.259 [2024-07-13 10:39:19.458175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.259 [2024-07-13 10:39:19.458203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.259 [2024-07-13 10:39:19.458326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.259 [2024-07-13 10:39:19.458342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.259 [2024-07-13 10:39:19.458463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.259 [2024-07-13 10:39:19.458480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 
dnr:0 00:08:03.259 #19 NEW cov: 11692 ft: 14743 corp: 18/48b lim: 5 exec/s: 0 rss: 69Mb L: 3/5 MS: 1 CopyPart- 00:08:03.259 [2024-07-13 10:39:19.508088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.259 [2024-07-13 10:39:19.508116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.259 [2024-07-13 10:39:19.508234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.259 [2024-07-13 10:39:19.508252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.259 #20 NEW cov: 11692 ft: 14752 corp: 19/50b lim: 5 exec/s: 0 rss: 69Mb L: 2/5 MS: 1 ChangeByte- 00:08:03.259 [2024-07-13 10:39:19.548484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.259 [2024-07-13 10:39:19.548511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.259 [2024-07-13 10:39:19.548637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.259 [2024-07-13 10:39:19.548653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.259 [2024-07-13 10:39:19.548779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.259 [2024-07-13 10:39:19.548796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.517 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:03.517 #21 NEW cov: 11715 ft: 14754 corp: 20/53b lim: 5 exec/s: 21 rss: 70Mb L: 3/5 MS: 1 ChangeByte- 00:08:03.517 [2024-07-13 10:39:19.880236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.517 [2024-07-13 10:39:19.880274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.518 [2024-07-13 10:39:19.880415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.518 [2024-07-13 10:39:19.880433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.518 [2024-07-13 10:39:19.880567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.518 [2024-07-13 10:39:19.880585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.518 [2024-07-13 10:39:19.880722] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.518 [2024-07-13 10:39:19.880743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.518 [2024-07-13 10:39:19.880877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.518 [2024-07-13 10:39:19.880894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:03.777 #22 NEW cov: 11715 ft: 14828 corp: 21/58b lim: 5 exec/s: 22 rss: 70Mb L: 5/5 MS: 1 ChangeBit- 00:08:03.777 [2024-07-13 10:39:19.940495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.777 [2024-07-13 10:39:19.940545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.777 [2024-07-13 10:39:19.940674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.777 [2024-07-13 10:39:19.940695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.777 [2024-07-13 10:39:19.940825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.777 [2024-07-13 10:39:19.940843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.777 [2024-07-13 10:39:19.940976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.777 [2024-07-13 10:39:19.940992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.777 [2024-07-13 10:39:19.941125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.777 [2024-07-13 10:39:19.941142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:03.777 #23 NEW cov: 11715 ft: 14885 corp: 22/63b lim: 5 exec/s: 23 rss: 70Mb L: 5/5 MS: 1 CopyPart- 00:08:03.777 [2024-07-13 10:39:20.000041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.777 [2024-07-13 10:39:20.000071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.777 [2024-07-13 10:39:20.000205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.777 [2024-07-13 10:39:20.000224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:08:03.777 [2024-07-13 10:39:20.000365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.777 [2024-07-13 10:39:20.000382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.777 #24 NEW cov: 11715 ft: 14897 corp: 23/66b lim: 5 exec/s: 24 rss: 70Mb L: 3/5 MS: 1 ShuffleBytes- 00:08:03.777 [2024-07-13 10:39:20.060227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.777 [2024-07-13 10:39:20.060259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.777 [2024-07-13 10:39:20.060395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.777 [2024-07-13 10:39:20.060415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.777 [2024-07-13 10:39:20.060550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.777 [2024-07-13 10:39:20.060567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.777 #25 NEW cov: 11715 ft: 14910 corp: 24/69b lim: 5 exec/s: 25 rss: 70Mb L: 3/5 MS: 1 EraseBytes- 00:08:03.777 [2024-07-13 10:39:20.110098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.777 [2024-07-13 10:39:20.110129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.777 [2024-07-13 10:39:20.110268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.777 [2024-07-13 10:39:20.110284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.777 #26 NEW cov: 11715 ft: 14939 corp: 25/71b lim: 5 exec/s: 26 rss: 70Mb L: 2/5 MS: 1 InsertByte- 00:08:03.777 [2024-07-13 10:39:20.160029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.777 [2024-07-13 10:39:20.160058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.035 #27 NEW cov: 11715 ft: 14952 corp: 26/72b lim: 5 exec/s: 27 rss: 70Mb L: 1/5 MS: 1 EraseBytes- 00:08:04.035 [2024-07-13 10:39:20.210412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.035 [2024-07-13 10:39:20.210445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.035 [2024-07-13 10:39:20.210583] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.035 [2024-07-13 10:39:20.210601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.035 #28 NEW cov: 11715 ft: 14958 corp: 27/74b lim: 5 exec/s: 28 rss: 70Mb L: 2/5 MS: 1 ChangeBinInt- 00:08:04.035 [2024-07-13 10:39:20.260818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.035 [2024-07-13 10:39:20.260846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.035 [2024-07-13 10:39:20.260975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.035 [2024-07-13 10:39:20.260993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.035 [2024-07-13 10:39:20.261125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.035 [2024-07-13 10:39:20.261147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.035 #29 NEW cov: 11715 ft: 14966 corp: 28/77b lim: 5 exec/s: 29 rss: 70Mb L: 3/5 MS: 1 ChangeBit- 00:08:04.035 [2024-07-13 10:39:20.320995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.035 [2024-07-13 10:39:20.321023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.035 [2024-07-13 10:39:20.321168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.035 [2024-07-13 10:39:20.321186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.035 [2024-07-13 10:39:20.321324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.035 [2024-07-13 10:39:20.321341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.035 #30 NEW cov: 11715 ft: 14999 corp: 29/80b lim: 5 exec/s: 30 rss: 70Mb L: 3/5 MS: 1 ShuffleBytes- 00:08:04.035 [2024-07-13 10:39:20.371304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.035 [2024-07-13 10:39:20.371331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.035 [2024-07-13 10:39:20.371467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.035 [2024-07-13 10:39:20.371486] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.035 [2024-07-13 10:39:20.371659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.035 [2024-07-13 10:39:20.371677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.035 #31 NEW cov: 11715 ft: 15001 corp: 30/83b lim: 5 exec/s: 31 rss: 70Mb L: 3/5 MS: 1 CopyPart- 00:08:04.293 [2024-07-13 10:39:20.431191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.293 [2024-07-13 10:39:20.431219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.293 [2024-07-13 10:39:20.431355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.293 [2024-07-13 10:39:20.431373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.293 #32 NEW cov: 11715 ft: 15012 corp: 31/85b lim: 5 exec/s: 32 rss: 70Mb L: 2/5 MS: 1 ChangeBinInt- 00:08:04.293 [2024-07-13 10:39:20.481362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.293 [2024-07-13 10:39:20.481391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.293 [2024-07-13 10:39:20.481521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.293 [2024-07-13 10:39:20.481539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.293 #33 NEW cov: 11715 ft: 15013 corp: 32/87b lim: 5 exec/s: 33 rss: 70Mb L: 2/5 MS: 1 ChangeByte- 00:08:04.293 [2024-07-13 10:39:20.541822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.293 [2024-07-13 10:39:20.541851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.293 [2024-07-13 10:39:20.541999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.293 [2024-07-13 10:39:20.542017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.293 [2024-07-13 10:39:20.542153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.293 [2024-07-13 10:39:20.542171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.293 #34 NEW cov: 11715 ft: 15069 corp: 33/90b lim: 5 exec/s: 34 rss: 70Mb 
L: 3/5 MS: 1 ChangeByte- 00:08:04.293 [2024-07-13 10:39:20.601924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.293 [2024-07-13 10:39:20.601952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.293 [2024-07-13 10:39:20.602075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.293 [2024-07-13 10:39:20.602093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.293 [2024-07-13 10:39:20.602226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.293 [2024-07-13 10:39:20.602244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.293 #35 NEW cov: 11715 ft: 15085 corp: 34/93b lim: 5 exec/s: 35 rss: 70Mb L: 3/5 MS: 1 InsertByte- 00:08:04.293 [2024-07-13 10:39:20.662885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.293 [2024-07-13 10:39:20.662914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.293 [2024-07-13 10:39:20.663046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.293 [2024-07-13 10:39:20.663066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.293 [2024-07-13 10:39:20.663195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.293 [2024-07-13 10:39:20.663212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.293 [2024-07-13 10:39:20.663349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.293 [2024-07-13 10:39:20.663368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.293 [2024-07-13 10:39:20.663503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.293 [2024-07-13 10:39:20.663525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:04.553 #36 NEW cov: 11715 ft: 15099 corp: 35/98b lim: 5 exec/s: 18 rss: 70Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:08:04.553 #36 DONE cov: 11715 ft: 15099 corp: 35/98b lim: 5 exec/s: 18 rss: 70Mb 00:08:04.553 Done 36 runs in 2 second(s) 00:08:04.553 10:39:20 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_9.conf 00:08:04.553 10:39:20 -- ../common.sh@72 -- # (( i++ )) 00:08:04.553 10:39:20 -- 
../common.sh@72 -- # (( i < fuzz_num )) 00:08:04.553 10:39:20 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:08:04.553 10:39:20 -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:08:04.553 10:39:20 -- nvmf/run.sh@24 -- # local timen=1 00:08:04.553 10:39:20 -- nvmf/run.sh@25 -- # local core=0x1 00:08:04.553 10:39:20 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:04.553 10:39:20 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:08:04.553 10:39:20 -- nvmf/run.sh@29 -- # printf %02d 10 00:08:04.553 10:39:20 -- nvmf/run.sh@29 -- # port=4410 00:08:04.553 10:39:20 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:04.553 10:39:20 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:08:04.553 10:39:20 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:04.553 10:39:20 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 -r /var/tmp/spdk10.sock 00:08:04.553 [2024-07-13 10:39:20.840183] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:08:04.553 [2024-07-13 10:39:20.840254] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1993852 ] 00:08:04.553 EAL: No free 2048 kB hugepages reported on node 1 00:08:04.812 [2024-07-13 10:39:21.013879] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:04.812 [2024-07-13 10:39:21.033006] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:04.812 [2024-07-13 10:39:21.033126] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:04.812 [2024-07-13 10:39:21.084572] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:04.812 [2024-07-13 10:39:21.100837] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:08:04.812 INFO: Running with entropic power schedule (0xFF, 100). 00:08:04.812 INFO: Seed: 1655420796 00:08:04.812 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:08:04.812 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:08:04.812 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:04.812 INFO: A corpus is not provided, starting from an empty corpus 00:08:04.812 #2 INITED exec/s: 0 rss: 60Mb 00:08:04.812 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:04.812 This may also happen if the target rejected all inputs we tried so far 00:08:04.812 [2024-07-13 10:39:21.156255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.812 [2024-07-13 10:39:21.156284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.812 [2024-07-13 10:39:21.156348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.813 [2024-07-13 10:39:21.156363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.072 NEW_FUNC[1/669]: 0x4ab4e0 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:08:05.072 NEW_FUNC[2/669]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:05.072 #30 NEW cov: 11504 ft: 11495 corp: 2/19b lim: 40 exec/s: 0 rss: 67Mb L: 18/18 MS: 3 CopyPart-EraseBytes-InsertRepeatedBytes- 00:08:05.072 [2024-07-13 10:39:21.456975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.072 [2024-07-13 10:39:21.457007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.072 [2024-07-13 10:39:21.457068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.072 [2024-07-13 10:39:21.457083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.332 NEW_FUNC[1/1]: 0x16dd7d0 in _nvme_qpair_complete_abort_queued_reqs /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:593 00:08:05.332 #36 NEW cov: 11624 ft: 11926 corp: 3/37b lim: 40 exec/s: 0 rss: 68Mb L: 18/18 MS: 1 ShuffleBytes- 00:08:05.332 [2024-07-13 10:39:21.507143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.332 [2024-07-13 10:39:21.507169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.332 [2024-07-13 10:39:21.507228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.332 [2024-07-13 10:39:21.507242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.332 [2024-07-13 10:39:21.507298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:5a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.332 [2024-07-13 10:39:21.507311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.332 #42 NEW cov: 11630 ft: 12329 corp: 4/65b lim: 40 exec/s: 0 rss: 68Mb L: 28/28 MS: 
1 CopyPart- 00:08:05.332 [2024-07-13 10:39:21.547109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:5a5a5ada SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.332 [2024-07-13 10:39:21.547135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.332 [2024-07-13 10:39:21.547195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.332 [2024-07-13 10:39:21.547209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.332 #48 NEW cov: 11715 ft: 12690 corp: 5/83b lim: 40 exec/s: 0 rss: 68Mb L: 18/28 MS: 1 ChangeBit- 00:08:05.332 [2024-07-13 10:39:21.587251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:5a1a5ada SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.332 [2024-07-13 10:39:21.587276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.332 [2024-07-13 10:39:21.587333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.332 [2024-07-13 10:39:21.587347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.332 #54 NEW cov: 11715 ft: 12764 corp: 6/101b lim: 40 exec/s: 0 rss: 68Mb L: 18/28 MS: 1 ChangeBit- 00:08:05.332 [2024-07-13 10:39:21.627354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:da5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.332 [2024-07-13 10:39:21.627380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.332 [2024-07-13 10:39:21.627438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.332 [2024-07-13 10:39:21.627456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.332 #55 NEW cov: 11715 ft: 12871 corp: 7/119b lim: 40 exec/s: 0 rss: 68Mb L: 18/28 MS: 1 ChangeBit- 00:08:05.332 [2024-07-13 10:39:21.667524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5b cdw11:5a5a5ada SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.332 [2024-07-13 10:39:21.667549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.332 [2024-07-13 10:39:21.667626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.332 [2024-07-13 10:39:21.667641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.332 #56 NEW cov: 11715 ft: 12934 corp: 8/137b lim: 40 exec/s: 0 rss: 68Mb L: 18/28 MS: 1 ChangeByte- 00:08:05.332 [2024-07-13 10:39:21.707474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.332 [2024-07-13 10:39:21.707500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.592 #57 NEW cov: 11715 ft: 13342 corp: 9/151b lim: 40 exec/s: 0 rss: 68Mb L: 14/28 MS: 1 EraseBytes- 00:08:05.592 [2024-07-13 10:39:21.748081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.592 [2024-07-13 10:39:21.748107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.592 [2024-07-13 10:39:21.748170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5af7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.592 [2024-07-13 10:39:21.748184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.592 [2024-07-13 10:39:21.748244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:f7f7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.592 [2024-07-13 10:39:21.748258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.592 [2024-07-13 10:39:21.748314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:f7f7f7f7 cdw11:f7f7f75a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.592 [2024-07-13 10:39:21.748327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.592 [2024-07-13 10:39:21.748388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:5a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.592 [2024-07-13 10:39:21.748401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:05.592 #58 NEW cov: 11715 ft: 13827 corp: 10/191b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:05.592 [2024-07-13 10:39:21.787850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:5a5a5ada SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.592 [2024-07-13 10:39:21.787879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.592 [2024-07-13 10:39:21.787939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:5a5ada5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.592 [2024-07-13 10:39:21.787953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.592 #59 NEW cov: 11715 ft: 13922 corp: 11/209b lim: 40 exec/s: 0 rss: 68Mb L: 18/40 MS: 1 CrossOver- 00:08:05.592 [2024-07-13 10:39:21.817992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.592 [2024-07-13 10:39:21.818018] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.592 [2024-07-13 10:39:21.818078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.592 [2024-07-13 10:39:21.818092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.592 #60 NEW cov: 11715 ft: 13995 corp: 12/228b lim: 40 exec/s: 0 rss: 70Mb L: 19/40 MS: 1 InsertByte- 00:08:05.592 [2024-07-13 10:39:21.848029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.592 [2024-07-13 10:39:21.848053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.592 [2024-07-13 10:39:21.848113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.592 [2024-07-13 10:39:21.848126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.592 #61 NEW cov: 11715 ft: 14060 corp: 13/247b lim: 40 exec/s: 0 rss: 70Mb L: 19/40 MS: 1 ShuffleBytes- 00:08:05.592 [2024-07-13 10:39:21.888138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:265a5a5a cdw11:da5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.592 [2024-07-13 10:39:21.888163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.592 [2024-07-13 10:39:21.888223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.592 [2024-07-13 10:39:21.888236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.592 #62 NEW cov: 11715 ft: 14080 corp: 14/265b lim: 40 exec/s: 0 rss: 70Mb L: 18/40 MS: 1 ChangeByte- 00:08:05.592 [2024-07-13 10:39:21.928256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.592 [2024-07-13 10:39:21.928281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.592 [2024-07-13 10:39:21.928339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.592 [2024-07-13 10:39:21.928353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.592 #63 NEW cov: 11715 ft: 14084 corp: 15/288b lim: 40 exec/s: 0 rss: 70Mb L: 23/40 MS: 1 CrossOver- 00:08:05.593 [2024-07-13 10:39:21.968267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.593 [2024-07-13 10:39:21.968292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:08:05.852 #64 NEW cov: 11715 ft: 14112 corp: 16/303b lim: 40 exec/s: 0 rss: 70Mb L: 15/40 MS: 1 InsertByte- 00:08:05.852 [2024-07-13 10:39:22.008465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:265a5a5a cdw11:5a5ada5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.852 [2024-07-13 10:39:22.008490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.852 [2024-07-13 10:39:22.008552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.852 [2024-07-13 10:39:22.008566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.852 #65 NEW cov: 11715 ft: 14123 corp: 17/321b lim: 40 exec/s: 0 rss: 70Mb L: 18/40 MS: 1 ShuffleBytes- 00:08:05.852 [2024-07-13 10:39:22.048813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.852 [2024-07-13 10:39:22.048838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.852 [2024-07-13 10:39:22.048897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5af7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.852 [2024-07-13 10:39:22.048911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.852 [2024-07-13 10:39:22.048969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:f7f7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.852 [2024-07-13 10:39:22.048981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.852 [2024-07-13 10:39:22.049040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:f7f75a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.852 [2024-07-13 10:39:22.049053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.852 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:05.852 #66 NEW cov: 11738 ft: 14189 corp: 18/356b lim: 40 exec/s: 0 rss: 70Mb L: 35/40 MS: 1 EraseBytes- 00:08:05.852 [2024-07-13 10:39:22.088862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.852 [2024-07-13 10:39:22.088886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.852 [2024-07-13 10:39:22.088948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5a255a5a cdw11:5a5affff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.852 [2024-07-13 10:39:22.088962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.852 [2024-07-13 10:39:22.089019] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.852 [2024-07-13 10:39:22.089032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.852 #67 NEW cov: 11738 ft: 14237 corp: 19/382b lim: 40 exec/s: 0 rss: 70Mb L: 26/40 MS: 1 InsertRepeatedBytes- 00:08:05.852 [2024-07-13 10:39:22.128858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.852 [2024-07-13 10:39:22.128883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.852 [2024-07-13 10:39:22.128943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5a5a535f cdw11:0a020000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.852 [2024-07-13 10:39:22.128957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.852 #68 NEW cov: 11738 ft: 14255 corp: 20/404b lim: 40 exec/s: 68 rss: 70Mb L: 22/40 MS: 1 CMP- DE: "S_\012\002\000\000\000\000"- 00:08:05.852 [2024-07-13 10:39:22.158933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:5a1a5ada SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.852 [2024-07-13 10:39:22.158958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.852 [2024-07-13 10:39:22.159015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.853 [2024-07-13 10:39:22.159029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.853 #69 NEW cov: 11738 ft: 14267 corp: 21/422b lim: 40 exec/s: 69 rss: 70Mb L: 18/40 MS: 1 ShuffleBytes- 00:08:05.853 [2024-07-13 10:39:22.199177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.853 [2024-07-13 10:39:22.199201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.853 [2024-07-13 10:39:22.199260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f7f7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.853 [2024-07-13 10:39:22.199273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.853 [2024-07-13 10:39:22.199330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:5a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.853 [2024-07-13 10:39:22.199343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.853 #70 NEW cov: 11738 ft: 14298 corp: 22/447b lim: 40 exec/s: 70 rss: 70Mb L: 25/40 MS: 1 EraseBytes- 00:08:06.112 [2024-07-13 10:39:22.239285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY 
RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a626262 cdw11:62626262 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.112 [2024-07-13 10:39:22.239311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.112 [2024-07-13 10:39:22.239375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:62626262 cdw11:62626262 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.112 [2024-07-13 10:39:22.239405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.113 [2024-07-13 10:39:22.239467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:5a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.113 [2024-07-13 10:39:22.239481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.113 #71 NEW cov: 11738 ft: 14302 corp: 23/477b lim: 40 exec/s: 71 rss: 70Mb L: 30/40 MS: 1 InsertRepeatedBytes- 00:08:06.113 [2024-07-13 10:39:22.279103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.113 [2024-07-13 10:39:22.279128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.113 #72 NEW cov: 11738 ft: 14313 corp: 24/492b lim: 40 exec/s: 72 rss: 70Mb L: 15/40 MS: 1 ChangeBit- 00:08:06.113 [2024-07-13 10:39:22.319193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.113 [2024-07-13 10:39:22.319219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.113 #73 NEW cov: 11738 ft: 14322 corp: 25/507b lim: 40 exec/s: 73 rss: 70Mb L: 15/40 MS: 1 ChangeByte- 00:08:06.113 [2024-07-13 10:39:22.349432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.113 [2024-07-13 10:39:22.349462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.113 [2024-07-13 10:39:22.349524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5a535f0a cdw11:02000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.113 [2024-07-13 10:39:22.349539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.113 #74 NEW cov: 11738 ft: 14344 corp: 26/526b lim: 40 exec/s: 74 rss: 70Mb L: 19/40 MS: 1 PersAutoDict- DE: "S_\012\002\000\000\000\000"- 00:08:06.113 [2024-07-13 10:39:22.389439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.113 [2024-07-13 10:39:22.389468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.113 #75 NEW cov: 11738 ft: 14443 corp: 27/541b lim: 40 exec/s: 75 rss: 70Mb L: 15/40 MS: 1 InsertByte- 00:08:06.113 [2024-07-13 10:39:22.429735] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a255a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.113 [2024-07-13 10:39:22.429760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.113 [2024-07-13 10:39:22.429817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.113 [2024-07-13 10:39:22.429830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.113 #76 NEW cov: 11738 ft: 14448 corp: 28/561b lim: 40 exec/s: 76 rss: 70Mb L: 20/40 MS: 1 InsertByte- 00:08:06.113 [2024-07-13 10:39:22.469825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.113 [2024-07-13 10:39:22.469850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.113 [2024-07-13 10:39:22.469909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5af7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.113 [2024-07-13 10:39:22.469923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.113 #77 NEW cov: 11738 ft: 14504 corp: 29/581b lim: 40 exec/s: 77 rss: 70Mb L: 20/40 MS: 1 EraseBytes- 00:08:06.373 [2024-07-13 10:39:22.509856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5d5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.373 [2024-07-13 10:39:22.509882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.373 [2024-07-13 10:39:22.509939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5a5a25da cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.373 [2024-07-13 10:39:22.509952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.373 #78 NEW cov: 11738 ft: 14542 corp: 30/597b lim: 40 exec/s: 78 rss: 70Mb L: 16/40 MS: 1 InsertByte- 00:08:06.373 [2024-07-13 10:39:22.550041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:265a5a5a cdw11:5a5a535f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.373 [2024-07-13 10:39:22.550066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.373 [2024-07-13 10:39:22.550125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0a020000 cdw11:00005a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.373 [2024-07-13 10:39:22.550139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.373 #79 NEW cov: 11738 ft: 14578 corp: 31/615b lim: 40 exec/s: 79 rss: 70Mb L: 18/40 MS: 1 PersAutoDict- DE: "S_\012\002\000\000\000\000"- 00:08:06.373 [2024-07-13 10:39:22.589997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:0a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.373 [2024-07-13 10:39:22.590022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.373 #80 NEW cov: 11738 ft: 14583 corp: 32/629b lim: 40 exec/s: 80 rss: 70Mb L: 14/40 MS: 1 CopyPart- 00:08:06.373 [2024-07-13 10:39:22.620200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.373 [2024-07-13 10:39:22.620225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.373 [2024-07-13 10:39:22.620280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.373 [2024-07-13 10:39:22.620294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.373 #81 NEW cov: 11738 ft: 14636 corp: 33/648b lim: 40 exec/s: 81 rss: 70Mb L: 19/40 MS: 1 ChangeByte- 00:08:06.373 [2024-07-13 10:39:22.650212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5d5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.373 [2024-07-13 10:39:22.650237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.373 #82 NEW cov: 11738 ft: 14703 corp: 34/661b lim: 40 exec/s: 82 rss: 70Mb L: 13/40 MS: 1 CrossOver- 00:08:06.373 [2024-07-13 10:39:22.690439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.373 [2024-07-13 10:39:22.690468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.373 [2024-07-13 10:39:22.690524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f7f7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.373 [2024-07-13 10:39:22.690537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.373 #83 NEW cov: 11738 ft: 14713 corp: 35/681b lim: 40 exec/s: 83 rss: 70Mb L: 20/40 MS: 1 EraseBytes- 00:08:06.373 [2024-07-13 10:39:22.730952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.373 [2024-07-13 10:39:22.730977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.374 [2024-07-13 10:39:22.731036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5af7f7f7 cdw11:f728f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.374 [2024-07-13 10:39:22.731050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.374 [2024-07-13 10:39:22.731111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:f7f7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:06.374 [2024-07-13 10:39:22.731124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.374 [2024-07-13 10:39:22.731179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:f7f7f7f7 cdw11:f7f7f75a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.374 [2024-07-13 10:39:22.731192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.374 [2024-07-13 10:39:22.731248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:5a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.374 [2024-07-13 10:39:22.731261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:06.374 #84 NEW cov: 11738 ft: 14719 corp: 36/721b lim: 40 exec/s: 84 rss: 70Mb L: 40/40 MS: 1 ChangeBinInt- 00:08:06.634 [2024-07-13 10:39:22.770689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a0a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 [2024-07-13 10:39:22.770715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.634 [2024-07-13 10:39:22.770774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5a5a5ada cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 [2024-07-13 10:39:22.770787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.634 #85 NEW cov: 11738 ft: 14729 corp: 37/741b lim: 40 exec/s: 85 rss: 70Mb L: 20/40 MS: 1 CrossOver- 00:08:06.634 [2024-07-13 10:39:22.810703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 [2024-07-13 10:39:22.810730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.634 #86 NEW cov: 11738 ft: 14749 corp: 38/749b lim: 40 exec/s: 86 rss: 70Mb L: 8/40 MS: 1 EraseBytes- 00:08:06.634 [2024-07-13 10:39:22.841349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 [2024-07-13 10:39:22.841376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.634 [2024-07-13 10:39:22.841433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5af7f7f7 cdw11:fbf7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 [2024-07-13 10:39:22.841451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.634 [2024-07-13 10:39:22.841512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:f7f7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 [2024-07-13 10:39:22.841525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.634 [2024-07-13 
10:39:22.841583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:f7f7f7f7 cdw11:f7f7f75a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 [2024-07-13 10:39:22.841596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.634 [2024-07-13 10:39:22.841653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:5a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 [2024-07-13 10:39:22.841667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:06.634 #87 NEW cov: 11738 ft: 14764 corp: 39/789b lim: 40 exec/s: 87 rss: 70Mb L: 40/40 MS: 1 ChangeBinInt- 00:08:06.634 [2024-07-13 10:39:22.881417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 [2024-07-13 10:39:22.881446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.634 [2024-07-13 10:39:22.881504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f7f7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 [2024-07-13 10:39:22.881518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.634 [2024-07-13 10:39:22.881577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:5a5a5a5a cdw11:5affffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 [2024-07-13 10:39:22.881591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.634 [2024-07-13 10:39:22.881650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 [2024-07-13 10:39:22.881663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.634 [2024-07-13 10:39:22.881721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 [2024-07-13 10:39:22.881733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:06.634 #88 NEW cov: 11738 ft: 14796 corp: 40/829b lim: 40 exec/s: 88 rss: 70Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:06.634 [2024-07-13 10:39:22.921072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5d5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 [2024-07-13 10:39:22.921097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.634 #89 NEW cov: 11738 ft: 14804 corp: 41/842b lim: 40 exec/s: 89 rss: 71Mb L: 13/40 MS: 1 ChangeBit- 00:08:06.634 [2024-07-13 10:39:22.961695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 
[2024-07-13 10:39:22.961720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.634 [2024-07-13 10:39:22.961797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5af7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 [2024-07-13 10:39:22.961811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.634 [2024-07-13 10:39:22.961869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:f7f7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 [2024-07-13 10:39:22.961883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.634 [2024-07-13 10:39:22.961944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:f7f7f7f7 cdw11:f7f7f75a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 [2024-07-13 10:39:22.961957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.634 [2024-07-13 10:39:22.962018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:5a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 [2024-07-13 10:39:22.962034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:06.634 [2024-07-13 10:39:22.991797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 [2024-07-13 10:39:22.991821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.634 [2024-07-13 10:39:22.991881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5af7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.635 [2024-07-13 10:39:22.991895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.635 [2024-07-13 10:39:22.991951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:f7f7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.635 [2024-07-13 10:39:22.991965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.635 [2024-07-13 10:39:22.992020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:f7f7f7f7 cdw11:f75a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.635 [2024-07-13 10:39:22.992033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.635 [2024-07-13 10:39:22.992089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:5a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.635 [2024-07-13 10:39:22.992102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:06.635 #91 
NEW cov: 11738 ft: 14814 corp: 42/882b lim: 40 exec/s: 91 rss: 71Mb L: 40/40 MS: 2 ShuffleBytes-CopyPart- 00:08:06.893 [2024-07-13 10:39:23.031330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5d5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.893 [2024-07-13 10:39:23.031356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.893 #92 NEW cov: 11738 ft: 14817 corp: 43/895b lim: 40 exec/s: 92 rss: 71Mb L: 13/40 MS: 1 ChangeByte- 00:08:06.893 [2024-07-13 10:39:23.071613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5b cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.893 [2024-07-13 10:39:23.071640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.893 [2024-07-13 10:39:23.071699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:5a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.893 [2024-07-13 10:39:23.071713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.893 #93 NEW cov: 11738 ft: 14821 corp: 44/914b lim: 40 exec/s: 93 rss: 71Mb L: 19/40 MS: 1 ChangeBinInt- 00:08:06.893 [2024-07-13 10:39:23.111604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:0a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.893 [2024-07-13 10:39:23.111629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.893 [2024-07-13 10:39:23.151860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a5a5a5a cdw11:0a5a5a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.893 [2024-07-13 10:39:23.151885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.893 [2024-07-13 10:39:23.151945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:5a5a0a5a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.893 [2024-07-13 10:39:23.151962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.893 #95 NEW cov: 11738 ft: 14828 corp: 45/936b lim: 40 exec/s: 47 rss: 71Mb L: 22/40 MS: 2 InsertByte-CopyPart- 00:08:06.893 #95 DONE cov: 11738 ft: 14828 corp: 45/936b lim: 40 exec/s: 47 rss: 71Mb 00:08:06.893 ###### Recommended dictionary. ###### 00:08:06.893 "S_\012\002\000\000\000\000" # Uses: 2 00:08:06.893 ###### End of recommended dictionary. 
###### 00:08:06.893 Done 95 runs in 2 second(s) 00:08:06.893 10:39:23 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_10.conf 00:08:07.152 10:39:23 -- ../common.sh@72 -- # (( i++ )) 00:08:07.152 10:39:23 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:07.152 10:39:23 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:08:07.152 10:39:23 -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:08:07.152 10:39:23 -- nvmf/run.sh@24 -- # local timen=1 00:08:07.152 10:39:23 -- nvmf/run.sh@25 -- # local core=0x1 00:08:07.152 10:39:23 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:07.152 10:39:23 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:08:07.152 10:39:23 -- nvmf/run.sh@29 -- # printf %02d 11 00:08:07.152 10:39:23 -- nvmf/run.sh@29 -- # port=4411 00:08:07.152 10:39:23 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:07.152 10:39:23 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:08:07.152 10:39:23 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:07.152 10:39:23 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 -r /var/tmp/spdk11.sock 00:08:07.152 [2024-07-13 10:39:23.324083] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:08:07.152 [2024-07-13 10:39:23.324150] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1994384 ] 00:08:07.152 EAL: No free 2048 kB hugepages reported on node 1 00:08:07.152 [2024-07-13 10:39:23.505694] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:07.152 [2024-07-13 10:39:23.524908] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:07.152 [2024-07-13 10:39:23.525050] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:07.411 [2024-07-13 10:39:23.576588] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:07.411 [2024-07-13 10:39:23.592888] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:08:07.411 INFO: Running with entropic power schedule (0xFF, 100). 00:08:07.411 INFO: Seed: 4146422255 00:08:07.411 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:08:07.411 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:08:07.411 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:07.411 INFO: A corpus is not provided, starting from an empty corpus 00:08:07.411 #2 INITED exec/s: 0 rss: 60Mb 00:08:07.411 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:07.411 This may also happen if the target rejected all inputs we tried so far 00:08:07.411 [2024-07-13 10:39:23.648822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.411 [2024-07-13 10:39:23.648851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.411 [2024-07-13 10:39:23.648925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.411 [2024-07-13 10:39:23.648942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.411 [2024-07-13 10:39:23.648998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.411 [2024-07-13 10:39:23.649011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.411 [2024-07-13 10:39:23.649067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.411 [2024-07-13 10:39:23.649079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.670 NEW_FUNC[1/671]: 0x4ad250 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:08:07.670 NEW_FUNC[2/671]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:07.670 #4 NEW cov: 11523 ft: 11524 corp: 2/35b lim: 40 exec/s: 0 rss: 67Mb L: 34/34 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:07.670 [2024-07-13 10:39:23.969343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff0b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.670 [2024-07-13 10:39:23.969427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.670 #8 NEW cov: 11636 ft: 13075 corp: 3/44b lim: 40 exec/s: 0 rss: 68Mb L: 9/34 MS: 4 ChangeBit-InsertRepeatedBytes-ChangeBit-InsertRepeatedBytes- 00:08:07.670 [2024-07-13 10:39:24.019001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffff6d cdw11:ffff0b00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.670 [2024-07-13 10:39:24.019027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.670 #14 NEW cov: 11642 ft: 13323 corp: 4/54b lim: 40 exec/s: 0 rss: 68Mb L: 10/34 MS: 1 InsertByte- 00:08:07.928 [2024-07-13 10:39:24.059650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.928 [2024-07-13 10:39:24.059677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.928 [2024-07-13 10:39:24.059737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 
cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.928 [2024-07-13 10:39:24.059751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.928 [2024-07-13 10:39:24.059810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.928 [2024-07-13 10:39:24.059824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.928 [2024-07-13 10:39:24.059882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.928 [2024-07-13 10:39:24.059896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.928 #15 NEW cov: 11727 ft: 13610 corp: 5/90b lim: 40 exec/s: 0 rss: 68Mb L: 36/36 MS: 1 CopyPart- 00:08:07.928 [2024-07-13 10:39:24.099772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:39393939 cdw11:22393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.928 [2024-07-13 10:39:24.099797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.928 [2024-07-13 10:39:24.099858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.928 [2024-07-13 10:39:24.099872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.928 [2024-07-13 10:39:24.099932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.928 [2024-07-13 10:39:24.099945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.928 [2024-07-13 10:39:24.100002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.928 [2024-07-13 10:39:24.100015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.928 #16 NEW cov: 11727 ft: 13723 corp: 6/124b lim: 40 exec/s: 0 rss: 68Mb L: 34/36 MS: 1 ChangeBinInt- 00:08:07.928 [2024-07-13 10:39:24.140062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.928 [2024-07-13 10:39:24.140087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.928 [2024-07-13 10:39:24.140146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.929 [2024-07-13 10:39:24.140160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.929 [2024-07-13 10:39:24.140219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 
nsid:0 cdw10:39393939 cdw11:39b5b5b5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.929 [2024-07-13 10:39:24.140232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.929 [2024-07-13 10:39:24.140292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:b5b5b539 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.929 [2024-07-13 10:39:24.140305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.929 [2024-07-13 10:39:24.140366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:39393939 cdw11:3939210a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.929 [2024-07-13 10:39:24.140380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:07.929 #17 NEW cov: 11727 ft: 13821 corp: 7/164b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:07.929 [2024-07-13 10:39:24.179970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.929 [2024-07-13 10:39:24.179995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.929 [2024-07-13 10:39:24.180057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.929 [2024-07-13 10:39:24.180071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.929 [2024-07-13 10:39:24.180129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.929 [2024-07-13 10:39:24.180143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.929 [2024-07-13 10:39:24.180201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.929 [2024-07-13 10:39:24.180216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.929 #18 NEW cov: 11727 ft: 13880 corp: 8/198b lim: 40 exec/s: 0 rss: 68Mb L: 34/40 MS: 1 CrossOver- 00:08:07.929 [2024-07-13 10:39:24.219609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffff6d cdw11:f8ff0b00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.929 [2024-07-13 10:39:24.219633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.929 #19 NEW cov: 11727 ft: 13910 corp: 9/208b lim: 40 exec/s: 0 rss: 68Mb L: 10/40 MS: 1 ChangeBinInt- 00:08:07.929 [2024-07-13 10:39:24.260206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:39393939 cdw11:22393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.929 [2024-07-13 10:39:24.260231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:08:07.929 [2024-07-13 10:39:24.260291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.929 [2024-07-13 10:39:24.260305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.929 [2024-07-13 10:39:24.260364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.929 [2024-07-13 10:39:24.260377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.929 [2024-07-13 10:39:24.260436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:74393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.929 [2024-07-13 10:39:24.260453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.929 #20 NEW cov: 11727 ft: 13936 corp: 10/242b lim: 40 exec/s: 0 rss: 68Mb L: 34/40 MS: 1 ChangeByte- 00:08:07.929 [2024-07-13 10:39:24.300342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.929 [2024-07-13 10:39:24.300367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.929 [2024-07-13 10:39:24.300425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.929 [2024-07-13 10:39:24.300439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.929 [2024-07-13 10:39:24.300501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.929 [2024-07-13 10:39:24.300514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.929 [2024-07-13 10:39:24.300573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:39393939 cdw11:39393938 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.929 [2024-07-13 10:39:24.300587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.187 #21 NEW cov: 11727 ft: 13981 corp: 11/276b lim: 40 exec/s: 0 rss: 68Mb L: 34/40 MS: 1 ChangeASCIIInt- 00:08:08.187 [2024-07-13 10:39:24.339988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:410b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.187 [2024-07-13 10:39:24.340013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.187 #22 NEW cov: 11727 ft: 14054 corp: 12/285b lim: 40 exec/s: 0 rss: 69Mb L: 9/40 MS: 1 ChangeByte- 00:08:08.188 [2024-07-13 10:39:24.380607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.188 [2024-07-13 10:39:24.380633] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.188 [2024-07-13 10:39:24.380696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.188 [2024-07-13 10:39:24.380710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.188 [2024-07-13 10:39:24.380771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.188 [2024-07-13 10:39:24.380785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.188 [2024-07-13 10:39:24.380843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.188 [2024-07-13 10:39:24.380857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.188 #23 NEW cov: 11727 ft: 14127 corp: 13/324b lim: 40 exec/s: 0 rss: 70Mb L: 39/40 MS: 1 CopyPart- 00:08:08.188 [2024-07-13 10:39:24.420207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffff6d cdw11:f7ff0b00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.188 [2024-07-13 10:39:24.420233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.188 #24 NEW cov: 11727 ft: 14146 corp: 14/334b lim: 40 exec/s: 0 rss: 70Mb L: 10/40 MS: 1 ChangeBinInt- 00:08:08.188 [2024-07-13 10:39:24.460326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffff6d cdw11:ffff0b00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.188 [2024-07-13 10:39:24.460351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.188 #25 NEW cov: 11727 ft: 14237 corp: 15/349b lim: 40 exec/s: 0 rss: 70Mb L: 15/40 MS: 1 InsertRepeatedBytes- 00:08:08.188 [2024-07-13 10:39:24.500482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffff6d cdw11:ffff0b00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.188 [2024-07-13 10:39:24.500507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.188 #26 NEW cov: 11727 ft: 14283 corp: 16/364b lim: 40 exec/s: 0 rss: 70Mb L: 15/40 MS: 1 CrossOver- 00:08:08.188 [2024-07-13 10:39:24.540556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fffeff6d cdw11:f8ff0b00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.188 [2024-07-13 10:39:24.540582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.188 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:08.188 #27 NEW cov: 11750 ft: 14325 corp: 17/374b lim: 40 exec/s: 0 rss: 70Mb L: 10/40 MS: 1 ChangeBit- 00:08:08.515 [2024-07-13 10:39:24.580839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 
cdw10:0a2d3939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.516 [2024-07-13 10:39:24.580865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.516 [2024-07-13 10:39:24.580927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.516 [2024-07-13 10:39:24.580940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.516 #30 NEW cov: 11750 ft: 14590 corp: 18/395b lim: 40 exec/s: 0 rss: 70Mb L: 21/40 MS: 3 InsertByte-ChangeBit-CrossOver- 00:08:08.516 [2024-07-13 10:39:24.621298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:39393904 cdw11:04040439 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.516 [2024-07-13 10:39:24.621324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.516 [2024-07-13 10:39:24.621388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:22393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.516 [2024-07-13 10:39:24.621402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.516 [2024-07-13 10:39:24.621463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.516 [2024-07-13 10:39:24.621476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.516 [2024-07-13 10:39:24.621534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:39393939 cdw11:74393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.516 [2024-07-13 10:39:24.621548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.516 #31 NEW cov: 11750 ft: 14664 corp: 19/433b lim: 40 exec/s: 31 rss: 70Mb L: 38/40 MS: 1 InsertRepeatedBytes- 00:08:08.516 [2024-07-13 10:39:24.661424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:39393939 cdw11:39393901 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.516 [2024-07-13 10:39:24.661455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.516 [2024-07-13 10:39:24.661516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000439 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.516 [2024-07-13 10:39:24.661530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.516 [2024-07-13 10:39:24.661589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.516 [2024-07-13 10:39:24.661602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.516 [2024-07-13 10:39:24.661662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.516 [2024-07-13 10:39:24.661675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.516 #32 NEW cov: 11750 ft: 14679 corp: 20/469b lim: 40 exec/s: 32 rss: 70Mb L: 36/40 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\004"- 00:08:08.516 [2024-07-13 10:39:24.701222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffff6d cdw11:ff393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.516 [2024-07-13 10:39:24.701248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.516 [2024-07-13 10:39:24.701306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:39223939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.516 [2024-07-13 10:39:24.701319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.516 #38 NEW cov: 11750 ft: 14694 corp: 21/492b lim: 40 exec/s: 38 rss: 70Mb L: 23/40 MS: 1 CrossOver- 00:08:08.516 [2024-07-13 10:39:24.741139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff6d0b00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.516 [2024-07-13 10:39:24.741168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.516 #39 NEW cov: 11750 ft: 14719 corp: 22/502b lim: 40 exec/s: 39 rss: 70Mb L: 10/40 MS: 1 CopyPart- 00:08:08.516 [2024-07-13 10:39:24.781287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffff6d cdw11:f8ff0b00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.516 [2024-07-13 10:39:24.781313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.516 #40 NEW cov: 11750 ft: 14723 corp: 23/512b lim: 40 exec/s: 40 rss: 70Mb L: 10/40 MS: 1 ChangeBinInt- 00:08:08.516 [2024-07-13 10:39:24.821852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:39393939 cdw11:38303901 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.516 [2024-07-13 10:39:24.821877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.516 [2024-07-13 10:39:24.821954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000439 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.516 [2024-07-13 10:39:24.821969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.516 [2024-07-13 10:39:24.822028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.516 [2024-07-13 10:39:24.822042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.516 [2024-07-13 10:39:24.822099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:08.516 [2024-07-13 10:39:24.822112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.516 #41 NEW cov: 11750 ft: 14737 corp: 24/548b lim: 40 exec/s: 41 rss: 70Mb L: 36/40 MS: 1 ChangeASCIIInt- 00:08:08.800 [2024-07-13 10:39:24.861541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fffffbff cdw11:6df7ff0b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.800 [2024-07-13 10:39:24.861568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.800 #42 NEW cov: 11750 ft: 14742 corp: 25/559b lim: 40 exec/s: 42 rss: 70Mb L: 11/40 MS: 1 InsertByte- 00:08:08.800 [2024-07-13 10:39:24.901816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f3393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.800 [2024-07-13 10:39:24.901841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.800 [2024-07-13 10:39:24.901898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39b5b5b5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.800 [2024-07-13 10:39:24.901912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.800 #46 NEW cov: 11750 ft: 14824 corp: 26/578b lim: 40 exec/s: 46 rss: 70Mb L: 19/40 MS: 4 CrossOver-CopyPart-ChangeByte-CrossOver- 00:08:08.800 [2024-07-13 10:39:24.942232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-07-13 10:39:24.942256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.801 [2024-07-13 10:39:24.942313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:db393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-07-13 10:39:24.942330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.801 [2024-07-13 10:39:24.942384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-07-13 10:39:24.942397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.801 [2024-07-13 10:39:24.942460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-07-13 10:39:24.942474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.801 #47 NEW cov: 11750 ft: 14837 corp: 27/614b lim: 40 exec/s: 47 rss: 70Mb L: 36/40 MS: 1 ChangeByte- 00:08:08.801 [2024-07-13 10:39:24.981954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a2d3939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-07-13 10:39:24.981979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.801 [2024-07-13 10:39:24.982033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39393d39 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-07-13 10:39:24.982047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.801 #48 NEW cov: 11750 ft: 14856 corp: 28/635b lim: 40 exec/s: 48 rss: 70Mb L: 21/40 MS: 1 ChangeBit- 00:08:08.801 [2024-07-13 10:39:25.022495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-07-13 10:39:25.022520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.801 [2024-07-13 10:39:25.022578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:db393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-07-13 10:39:25.022591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.801 [2024-07-13 10:39:25.022650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:39240000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-07-13 10:39:25.022662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.801 [2024-07-13 10:39:25.022722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-07-13 10:39:25.022735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.801 #49 NEW cov: 11750 ft: 14881 corp: 29/671b lim: 40 exec/s: 49 rss: 70Mb L: 36/40 MS: 1 ChangeBinInt- 00:08:08.801 [2024-07-13 10:39:25.062243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffff6d cdw11:ff393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-07-13 10:39:25.062268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.801 [2024-07-13 10:39:25.062324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:39223939 cdw11:908516f4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-07-13 10:39:25.062337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.801 #50 NEW cov: 11750 ft: 14963 corp: 30/694b lim: 40 exec/s: 50 rss: 70Mb L: 23/40 MS: 1 CMP- DE: "\220\205\026\364\014\177\000\000"- 00:08:08.801 [2024-07-13 10:39:25.102574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffff6d cdw11:f7ff0b00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-07-13 10:39:25.102602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.801 [2024-07-13 10:39:25.102662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:004a4a4a 
cdw11:4a4a4a4a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-07-13 10:39:25.102675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.801 [2024-07-13 10:39:25.102731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4a4a4a4a cdw11:4a4a4a4a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-07-13 10:39:25.102744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.801 #51 NEW cov: 11750 ft: 15159 corp: 31/724b lim: 40 exec/s: 51 rss: 70Mb L: 30/40 MS: 1 InsertRepeatedBytes- 00:08:08.801 [2024-07-13 10:39:25.142289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffff6d cdw11:ffff0b00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-07-13 10:39:25.142314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.801 #52 NEW cov: 11750 ft: 15174 corp: 32/734b lim: 40 exec/s: 52 rss: 70Mb L: 10/40 MS: 1 CrossOver- 00:08:08.801 [2024-07-13 10:39:25.182771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffff6d cdw11:ff393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-07-13 10:39:25.182796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.801 [2024-07-13 10:39:25.182854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:39223939 cdw11:39908516 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-07-13 10:39:25.182868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.801 [2024-07-13 10:39:25.182924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:f40c7f00 cdw11:00393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-07-13 10:39:25.182937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.061 #53 NEW cov: 11750 ft: 15176 corp: 33/765b lim: 40 exec/s: 53 rss: 70Mb L: 31/40 MS: 1 PersAutoDict- DE: "\220\205\026\364\014\177\000\000"- 00:08:09.061 [2024-07-13 10:39:25.223061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.061 [2024-07-13 10:39:25.223086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.061 [2024-07-13 10:39:25.223146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.061 [2024-07-13 10:39:25.223160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.061 [2024-07-13 10:39:25.223215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.061 [2024-07-13 10:39:25.223229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 
sqhd:0011 p:0 m:0 dnr:0 00:08:09.061 [2024-07-13 10:39:25.223284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:39393939 cdw11:39393938 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.061 [2024-07-13 10:39:25.223297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.061 #54 NEW cov: 11750 ft: 15180 corp: 34/799b lim: 40 exec/s: 54 rss: 70Mb L: 34/40 MS: 1 ChangeASCIIInt- 00:08:09.061 [2024-07-13 10:39:25.263145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:39393939 cdw11:38303901 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.061 [2024-07-13 10:39:25.263170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.061 [2024-07-13 10:39:25.263228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000439 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.061 [2024-07-13 10:39:25.263241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.061 [2024-07-13 10:39:25.263300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.061 [2024-07-13 10:39:25.263314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.061 [2024-07-13 10:39:25.263372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:39393939 cdw11:39393938 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.061 [2024-07-13 10:39:25.263385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.061 #55 NEW cov: 11750 ft: 15189 corp: 35/835b lim: 40 exec/s: 55 rss: 70Mb L: 36/40 MS: 1 ChangeBit- 00:08:09.061 [2024-07-13 10:39:25.302959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffff6d cdw11:ff393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.061 [2024-07-13 10:39:25.302983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.061 [2024-07-13 10:39:25.303041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:398516f4 cdw11:0c7f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.061 [2024-07-13 10:39:25.303054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.061 #56 NEW cov: 11750 ft: 15278 corp: 36/854b lim: 40 exec/s: 56 rss: 70Mb L: 19/40 MS: 1 EraseBytes- 00:08:09.061 [2024-07-13 10:39:25.343007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffff6d cdw11:ff393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.061 [2024-07-13 10:39:25.343031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.061 [2024-07-13 10:39:25.343089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:392216f4 cdw11:85900c39 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.061 
[2024-07-13 10:39:25.343103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.061 #57 NEW cov: 11750 ft: 15306 corp: 37/877b lim: 40 exec/s: 57 rss: 70Mb L: 23/40 MS: 1 ShuffleBytes- 00:08:09.061 [2024-07-13 10:39:25.382969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fffffff8 cdw11:6dff000b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.061 [2024-07-13 10:39:25.382994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.061 #58 NEW cov: 11750 ft: 15349 corp: 38/887b lim: 40 exec/s: 58 rss: 71Mb L: 10/40 MS: 1 ShuffleBytes- 00:08:09.061 [2024-07-13 10:39:25.423131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffff6d cdw11:f8000501 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.061 [2024-07-13 10:39:25.423156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.321 #59 NEW cov: 11750 ft: 15397 corp: 39/895b lim: 40 exec/s: 59 rss: 71Mb L: 8/40 MS: 1 EraseBytes- 00:08:09.321 [2024-07-13 10:39:25.463607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-07-13 10:39:25.463635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.321 [2024-07-13 10:39:25.463697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-07-13 10:39:25.463711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.321 [2024-07-13 10:39:25.463770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-07-13 10:39:25.463783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.321 #60 NEW cov: 11750 ft: 15404 corp: 40/922b lim: 40 exec/s: 60 rss: 71Mb L: 27/40 MS: 1 EraseBytes- 00:08:09.321 [2024-07-13 10:39:25.503790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:39393939 cdw11:38303901 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-07-13 10:39:25.503814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.321 [2024-07-13 10:39:25.503875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000439 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-07-13 10:39:25.503888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.321 [2024-07-13 10:39:25.503949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-07-13 10:39:25.503962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 
sqhd:0011 p:0 m:0 dnr:0 00:08:09.321 [2024-07-13 10:39:25.504023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-07-13 10:39:25.504036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.321 #61 NEW cov: 11750 ft: 15410 corp: 41/958b lim: 40 exec/s: 61 rss: 71Mb L: 36/40 MS: 1 ChangeByte- 00:08:09.321 [2024-07-13 10:39:25.543454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ff010b00 cdw11:0b000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-07-13 10:39:25.543478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.321 #65 NEW cov: 11750 ft: 15448 corp: 42/966b lim: 40 exec/s: 65 rss: 71Mb L: 8/40 MS: 4 EraseBytes-EraseBytes-CopyPart-InsertByte- 00:08:09.321 [2024-07-13 10:39:25.584086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:39393939 cdw11:38303901 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-07-13 10:39:25.584111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.321 [2024-07-13 10:39:25.584171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000439 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-07-13 10:39:25.584185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.321 [2024-07-13 10:39:25.584228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-07-13 10:39:25.584241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.321 [2024-07-13 10:39:25.584309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-07-13 10:39:25.584323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.321 #66 NEW cov: 11750 ft: 15459 corp: 43/1002b lim: 40 exec/s: 66 rss: 71Mb L: 36/40 MS: 1 ShuffleBytes- 00:08:09.321 [2024-07-13 10:39:25.624187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:39393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-07-13 10:39:25.624212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.321 [2024-07-13 10:39:25.624274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:db393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-07-13 10:39:25.624288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.321 [2024-07-13 10:39:25.624349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:39240000 cdw11:00000000 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:08:09.321 [2024-07-13 10:39:25.624362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.321 [2024-07-13 10:39:25.624424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00393939 cdw11:39393939 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-07-13 10:39:25.624437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.321 #67 NEW cov: 11750 ft: 15489 corp: 44/1038b lim: 40 exec/s: 33 rss: 71Mb L: 36/40 MS: 1 CopyPart- 00:08:09.321 #67 DONE cov: 11750 ft: 15489 corp: 44/1038b lim: 40 exec/s: 33 rss: 71Mb 00:08:09.321 ###### Recommended dictionary. ###### 00:08:09.321 "\001\000\000\000\000\000\000\004" # Uses: 0 00:08:09.321 "\220\205\026\364\014\177\000\000" # Uses: 1 00:08:09.321 ###### End of recommended dictionary. ###### 00:08:09.321 Done 67 runs in 2 second(s) 00:08:09.581 10:39:25 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_11.conf 00:08:09.581 10:39:25 -- ../common.sh@72 -- # (( i++ )) 00:08:09.581 10:39:25 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:09.581 10:39:25 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:08:09.581 10:39:25 -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:08:09.581 10:39:25 -- nvmf/run.sh@24 -- # local timen=1 00:08:09.581 10:39:25 -- nvmf/run.sh@25 -- # local core=0x1 00:08:09.581 10:39:25 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:09.581 10:39:25 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:08:09.581 10:39:25 -- nvmf/run.sh@29 -- # printf %02d 12 00:08:09.581 10:39:25 -- nvmf/run.sh@29 -- # port=4412 00:08:09.581 10:39:25 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:09.581 10:39:25 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:08:09.581 10:39:25 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:09.581 10:39:25 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 -r /var/tmp/spdk12.sock 00:08:09.581 [2024-07-13 10:39:25.791368] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
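(The nvmf/run.sh trace above shows the fixed per-fuzzer pattern the harness repeats for every index: derive a listener port from the fuzzer number, rewrite the JSON config's trsvcid with sed, create a per-index corpus directory, and launch llvm_nvme_fuzz with -Z selecting which command handler to exercise; the NEW_FUNC lines below show index 12 driving fuzz_admin_directive_send_command. A minimal sketch of that pattern follows, assuming SPDK_ROOT and OUT as hypothetical stand-ins for the workspace paths and assuming the sed output is redirected into the per-run conf, which the xtrace itself does not show:

  # Sketch only; reconstructed from the xtrace above, not the actual run.sh.
  run_one_fuzzer() {
    local n=$1                        # fuzzer index, e.g. 12
    local port=$((4400 + n))          # 12 -> 4412, 13 -> 4413
    local conf=/tmp/fuzz_json_${n}.conf
    local corpus="$SPDK_ROOT/../corpus/llvm_nvmf_${n}"
    local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:${port}"

    mkdir -p "$corpus"
    # Rewrite the template's default listener port for this run.
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${port}\"/" \
      "$SPDK_ROOT/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$conf"
    # -t 1: run for 1 second; -Z n: select fuzzer n; -D: libFuzzer corpus dir.
    "$SPDK_ROOT/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
      -m 0x1 -s 512 -P "$OUT/llvm/" -F "$trid" \
      -c "$conf" -t 1 -D "$corpus" -Z "$n" -r "/var/tmp/spdk${n}.sock"
    rm -rf "$conf"
  }
)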
00:08:09.581 [2024-07-13 10:39:25.791422] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1994687 ] 00:08:09.581 EAL: No free 2048 kB hugepages reported on node 1 00:08:09.581 [2024-07-13 10:39:25.965556] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:09.840 [2024-07-13 10:39:25.985331] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:09.840 [2024-07-13 10:39:25.985484] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:09.840 [2024-07-13 10:39:26.036911] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:09.840 [2024-07-13 10:39:26.053210] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:08:09.840 INFO: Running with entropic power schedule (0xFF, 100). 00:08:09.840 INFO: Seed: 2312456281 00:08:09.840 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:08:09.840 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:08:09.840 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:09.840 INFO: A corpus is not provided, starting from an empty corpus 00:08:09.840 #2 INITED exec/s: 0 rss: 60Mb 00:08:09.840 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:09.840 This may also happen if the target rejected all inputs we tried so far 00:08:09.840 [2024-07-13 10:39:26.119189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.840 [2024-07-13 10:39:26.119227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.099 NEW_FUNC[1/666]: 0x4aefc0 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:08:10.099 NEW_FUNC[2/666]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:10.099 #3 NEW cov: 11477 ft: 11478 corp: 2/10b lim: 40 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:08:10.099 [2024-07-13 10:39:26.450034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.099 [2024-07-13 10:39:26.450072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.099 NEW_FUNC[1/5]: 0x16decf0 in spdk_nvme_qpair_process_completions /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:757 00:08:10.099 NEW_FUNC[2/5]: 0x17402a0 in nvme_transport_qpair_process_completions /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_transport.c:606 00:08:10.099 #4 NEW cov: 11634 ft: 12070 corp: 3/19b lim: 40 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 CMP- DE: "\012\000\000\000"- 00:08:10.359 [2024-07-13 10:39:26.500145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.359 [2024-07-13 10:39:26.500172] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.359 #5 NEW cov: 11640 ft: 12456 corp: 4/34b lim: 40 exec/s: 0 rss: 68Mb L: 15/15 MS: 1 CopyPart- 00:08:10.359 [2024-07-13 10:39:26.540254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0097000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.359 [2024-07-13 10:39:26.540282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.359 #9 NEW cov: 11725 ft: 12725 corp: 5/42b lim: 40 exec/s: 0 rss: 68Mb L: 8/15 MS: 4 EraseBytes-EraseBytes-CMP-InsertByte- DE: "\000\000\000\000"- 00:08:10.359 [2024-07-13 10:39:26.580142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00973f0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.359 [2024-07-13 10:39:26.580171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.359 #20 NEW cov: 11725 ft: 12784 corp: 6/50b lim: 40 exec/s: 0 rss: 68Mb L: 8/15 MS: 1 ChangeByte- 00:08:10.359 [2024-07-13 10:39:26.620463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.359 [2024-07-13 10:39:26.620489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.359 #21 NEW cov: 11725 ft: 12825 corp: 7/65b lim: 40 exec/s: 0 rss: 68Mb L: 15/15 MS: 1 CrossOver- 00:08:10.359 [2024-07-13 10:39:26.660113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.359 [2024-07-13 10:39:26.660138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.359 #22 NEW cov: 11725 ft: 12962 corp: 8/80b lim: 40 exec/s: 0 rss: 69Mb L: 15/15 MS: 1 ShuffleBytes- 00:08:10.359 [2024-07-13 10:39:26.700749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.359 [2024-07-13 10:39:26.700776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.359 #23 NEW cov: 11725 ft: 13010 corp: 9/91b lim: 40 exec/s: 0 rss: 69Mb L: 11/15 MS: 1 InsertRepeatedBytes- 00:08:10.359 [2024-07-13 10:39:26.740861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:2a2a2a2a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.359 [2024-07-13 10:39:26.740887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.619 #28 NEW cov: 11725 ft: 13087 corp: 10/100b lim: 40 exec/s: 0 rss: 69Mb L: 9/15 MS: 5 ChangeByte-ShuffleBytes-CopyPart-PersAutoDict-InsertRepeatedBytes- DE: "\000\000\000\000"- 00:08:10.619 [2024-07-13 10:39:26.781039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.619 [2024-07-13 10:39:26.781066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.619 #29 NEW cov: 11725 ft: 13128 corp: 11/115b lim: 40 exec/s: 0 rss: 69Mb L: 15/15 MS: 1 ShuffleBytes- 00:08:10.619 [2024-07-13 10:39:26.820694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:fd000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.619 [2024-07-13 10:39:26.820724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.619 #30 NEW cov: 11725 ft: 13157 corp: 12/124b lim: 40 exec/s: 0 rss: 69Mb L: 9/15 MS: 1 ChangeBinInt- 00:08:10.619 [2024-07-13 10:39:26.861175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ff180000 cdw11:00973f0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.619 [2024-07-13 10:39:26.861205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.619 #31 NEW cov: 11725 ft: 13235 corp: 13/132b lim: 40 exec/s: 0 rss: 69Mb L: 8/15 MS: 1 CMP- DE: "\377\030"- 00:08:10.619 [2024-07-13 10:39:26.901249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.619 [2024-07-13 10:39:26.901277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.619 #32 NEW cov: 11725 ft: 13274 corp: 14/141b lim: 40 exec/s: 0 rss: 69Mb L: 9/15 MS: 1 PersAutoDict- DE: "\012\000\000\000"- 00:08:10.619 [2024-07-13 10:39:26.941359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.619 [2024-07-13 10:39:26.941387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.619 #33 NEW cov: 11725 ft: 13278 corp: 15/152b lim: 40 exec/s: 0 rss: 69Mb L: 11/15 MS: 1 ShuffleBytes- 00:08:10.619 [2024-07-13 10:39:26.992428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.619 [2024-07-13 10:39:26.992460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.619 [2024-07-13 10:39:26.992596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.619 [2024-07-13 10:39:26.992612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.619 [2024-07-13 10:39:26.992739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.619 [2024-07-13 10:39:26.992755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.619 [2024-07-13 10:39:26.992884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.619 [2024-07-13 10:39:26.992900] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.879 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:10.879 #34 NEW cov: 11748 ft: 14155 corp: 16/190b lim: 40 exec/s: 0 rss: 69Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:10.879 [2024-07-13 10:39:27.031782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00090000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.879 [2024-07-13 10:39:27.031810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.879 [2024-07-13 10:39:27.031940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:97970000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.879 [2024-07-13 10:39:27.031956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.879 #35 NEW cov: 11748 ft: 14371 corp: 17/207b lim: 40 exec/s: 0 rss: 69Mb L: 17/38 MS: 1 CMP- DE: "\011\000"- 00:08:10.879 [2024-07-13 10:39:27.092569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.879 [2024-07-13 10:39:27.092598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.879 [2024-07-13 10:39:27.092737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00008989 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.879 [2024-07-13 10:39:27.092754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.879 [2024-07-13 10:39:27.092883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:89898989 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.879 [2024-07-13 10:39:27.092901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.879 [2024-07-13 10:39:27.093031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:89898989 cdw11:89898989 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.879 [2024-07-13 10:39:27.093046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.879 #36 NEW cov: 11748 ft: 14379 corp: 18/245b lim: 40 exec/s: 36 rss: 69Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:10.879 [2024-07-13 10:39:27.141916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00973f0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.879 [2024-07-13 10:39:27.141951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.879 #37 NEW cov: 11748 ft: 14492 corp: 19/253b lim: 40 exec/s: 37 rss: 69Mb L: 8/38 MS: 1 ShuffleBytes- 00:08:10.879 [2024-07-13 10:39:27.182377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00090000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.879 [2024-07-13 10:39:27.182405] nvme_qpair.c: 
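(Each "#N NEW" line in the run above is a standard libFuzzer status line: cov counts covered code edges, ft counts finer-grained coverage features, corp gives corpus units and total bytes, L gives this input's length against the 40-byte limit, and MS lists the mutation sequence that produced it, such as ChangeBit or EraseBytes. A throwaway helper, not part of the test suite, for watching coverage growth across a run; fuzz.log is a hypothetical file holding captured output:

  # Emits "<exec #> <edges covered>" per coverage-increasing input.
  grep -o '#[0-9]* NEW cov: [0-9]*' fuzz.log |
    awk '{ gsub("#", "", $1); print $1, $4 }'
)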
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.879 [2024-07-13 10:39:27.182555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:97970000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.879 [2024-07-13 10:39:27.182574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.879 #38 NEW cov: 11748 ft: 14534 corp: 20/270b lim: 40 exec/s: 38 rss: 69Mb L: 17/38 MS: 1 ShuffleBytes- 00:08:10.879 [2024-07-13 10:39:27.233020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.879 [2024-07-13 10:39:27.233050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.879 [2024-07-13 10:39:27.233199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.879 [2024-07-13 10:39:27.233215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.879 [2024-07-13 10:39:27.233333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000097 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.879 [2024-07-13 10:39:27.233349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.879 [2024-07-13 10:39:27.233483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.879 [2024-07-13 10:39:27.233499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.879 #39 NEW cov: 11748 ft: 14559 corp: 21/308b lim: 40 exec/s: 39 rss: 69Mb L: 38/38 MS: 1 CrossOver- 00:08:11.139 [2024-07-13 10:39:27.281956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2f002700 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.139 [2024-07-13 10:39:27.281984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.139 #42 NEW cov: 11748 ft: 14568 corp: 22/316b lim: 40 exec/s: 42 rss: 69Mb L: 8/38 MS: 3 EraseBytes-InsertByte-InsertByte- 00:08:11.139 [2024-07-13 10:39:27.322756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.139 [2024-07-13 10:39:27.322785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.139 [2024-07-13 10:39:27.322920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000002a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.139 [2024-07-13 10:39:27.322936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.139 #43 NEW cov: 11748 ft: 14600 corp: 23/337b lim: 40 exec/s: 43 rss: 69Mb L: 21/38 MS: 1 CrossOver- 
00:08:11.139 [2024-07-13 10:39:27.373498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.139 [2024-07-13 10:39:27.373525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.139 [2024-07-13 10:39:27.373658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.139 [2024-07-13 10:39:27.373674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.139 [2024-07-13 10:39:27.373804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00008400 cdw11:97000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.139 [2024-07-13 10:39:27.373821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.139 [2024-07-13 10:39:27.373946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.139 [2024-07-13 10:39:27.373961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.139 #49 NEW cov: 11748 ft: 14666 corp: 24/376b lim: 40 exec/s: 49 rss: 70Mb L: 39/39 MS: 1 InsertByte- 00:08:11.139 [2024-07-13 10:39:27.422349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000ff18 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.139 [2024-07-13 10:39:27.422377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.139 #50 NEW cov: 11748 ft: 14681 corp: 25/391b lim: 40 exec/s: 50 rss: 70Mb L: 15/39 MS: 1 PersAutoDict- DE: "\377\030"- 00:08:11.139 [2024-07-13 10:39:27.462804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.139 [2024-07-13 10:39:27.462833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.139 [2024-07-13 10:39:27.462961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.139 [2024-07-13 10:39:27.462979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.139 #51 NEW cov: 11748 ft: 14713 corp: 26/414b lim: 40 exec/s: 51 rss: 70Mb L: 23/39 MS: 1 PersAutoDict- DE: "\377\030"- 00:08:11.139 [2024-07-13 10:39:27.523317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00a00000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.139 [2024-07-13 10:39:27.523345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.139 [2024-07-13 10:39:27.523475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:97970000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:11.139 [2024-07-13 10:39:27.523491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.398 #52 NEW cov: 11748 ft: 14731 corp: 27/431b lim: 40 exec/s: 52 rss: 70Mb L: 17/39 MS: 1 ChangeByte- 00:08:11.398 [2024-07-13 10:39:27.573215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00313f0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.398 [2024-07-13 10:39:27.573243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.398 #53 NEW cov: 11748 ft: 14738 corp: 28/439b lim: 40 exec/s: 53 rss: 70Mb L: 8/39 MS: 1 ChangeByte- 00:08:11.398 [2024-07-13 10:39:27.613574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.398 [2024-07-13 10:39:27.613603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.398 [2024-07-13 10:39:27.613739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.398 [2024-07-13 10:39:27.613757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.398 #54 NEW cov: 11748 ft: 14745 corp: 29/458b lim: 40 exec/s: 54 rss: 70Mb L: 19/39 MS: 1 CMP- DE: "\377\377\377\000"- 00:08:11.398 [2024-07-13 10:39:27.653495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000800 cdw11:00973f0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.398 [2024-07-13 10:39:27.653524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.398 #55 NEW cov: 11748 ft: 14756 corp: 30/466b lim: 40 exec/s: 55 rss: 70Mb L: 8/39 MS: 1 ChangeBit- 00:08:11.398 [2024-07-13 10:39:27.704120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.398 [2024-07-13 10:39:27.704149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.398 [2024-07-13 10:39:27.704283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.398 [2024-07-13 10:39:27.704300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.398 [2024-07-13 10:39:27.704423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.398 [2024-07-13 10:39:27.704439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.398 #56 NEW cov: 11748 ft: 14953 corp: 31/497b lim: 40 exec/s: 56 rss: 70Mb L: 31/39 MS: 1 InsertRepeatedBytes- 00:08:11.398 [2024-07-13 10:39:27.753393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00002500 cdw11:00973f0a SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:08:11.398 [2024-07-13 10:39:27.753420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.398 #57 NEW cov: 11748 ft: 14958 corp: 32/505b lim: 40 exec/s: 57 rss: 70Mb L: 8/39 MS: 1 ChangeByte- 00:08:11.657 [2024-07-13 10:39:27.804714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.657 [2024-07-13 10:39:27.804742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.657 [2024-07-13 10:39:27.804871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.657 [2024-07-13 10:39:27.804889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.657 [2024-07-13 10:39:27.805014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.657 [2024-07-13 10:39:27.805032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.657 [2024-07-13 10:39:27.805162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.657 [2024-07-13 10:39:27.805180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.657 #58 NEW cov: 11748 ft: 14989 corp: 33/543b lim: 40 exec/s: 58 rss: 70Mb L: 38/39 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\001"- 00:08:11.657 [2024-07-13 10:39:27.843696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2f000800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.657 [2024-07-13 10:39:27.843727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.657 #59 NEW cov: 11748 ft: 14999 corp: 34/551b lim: 40 exec/s: 59 rss: 70Mb L: 8/39 MS: 1 ChangeBinInt- 00:08:11.657 [2024-07-13 10:39:27.883899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:090000ff cdw11:fd000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.657 [2024-07-13 10:39:27.883926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.657 #60 NEW cov: 11748 ft: 15002 corp: 35/560b lim: 40 exec/s: 60 rss: 70Mb L: 9/39 MS: 1 PersAutoDict- DE: "\011\000"- 00:08:11.657 [2024-07-13 10:39:27.924346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000ff00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.657 [2024-07-13 10:39:27.924376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.657 #61 NEW cov: 11748 ft: 15022 corp: 36/570b lim: 40 exec/s: 61 rss: 70Mb L: 10/39 MS: 1 EraseBytes- 00:08:11.657 [2024-07-13 10:39:27.964373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00070801 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.657 [2024-07-13 10:39:27.964400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.657 #65 NEW cov: 11748 ft: 15048 corp: 37/585b lim: 40 exec/s: 65 rss: 70Mb L: 15/39 MS: 4 EraseBytes-ChangeBinInt-ChangeBit-PersAutoDict- DE: "\001\000\000\000\000\000\000\001"- 00:08:11.657 [2024-07-13 10:39:28.004649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000ff00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.657 [2024-07-13 10:39:28.004677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.657 [2024-07-13 10:39:28.004801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00270000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.657 [2024-07-13 10:39:28.004818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.657 #66 NEW cov: 11748 ft: 15053 corp: 38/602b lim: 40 exec/s: 66 rss: 70Mb L: 17/39 MS: 1 CrossOver- 00:08:11.917 [2024-07-13 10:39:28.065530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.917 [2024-07-13 10:39:28.065573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.917 [2024-07-13 10:39:28.065705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00a7a7a7 cdw11:a7a7a700 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.917 [2024-07-13 10:39:28.065723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.917 [2024-07-13 10:39:28.065850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.917 [2024-07-13 10:39:28.065868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.917 [2024-07-13 10:39:28.066001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.917 [2024-07-13 10:39:28.066016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.917 #67 NEW cov: 11748 ft: 15110 corp: 39/639b lim: 40 exec/s: 67 rss: 70Mb L: 37/39 MS: 1 InsertRepeatedBytes- 00:08:11.917 [2024-07-13 10:39:28.114377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0afd0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.917 [2024-07-13 10:39:28.114404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.917 #68 NEW cov: 11748 ft: 15119 corp: 40/650b lim: 40 exec/s: 34 rss: 70Mb L: 11/39 MS: 1 CrossOver- 00:08:11.917 #68 DONE cov: 11748 ft: 15119 corp: 40/650b lim: 40 exec/s: 34 rss: 70Mb 00:08:11.917 ###### Recommended dictionary. 
###### 00:08:11.917 "\012\000\000\000" # Uses: 2 00:08:11.917 "\000\000\000\000" # Uses: 1 00:08:11.917 "\377\030" # Uses: 3 00:08:11.917 "\011\000" # Uses: 1 00:08:11.917 "\377\377\377\000" # Uses: 0 00:08:11.917 "\001\000\000\000\000\000\000\001" # Uses: 1 00:08:11.917 ###### End of recommended dictionary. ###### 00:08:11.917 Done 68 runs in 2 second(s) 00:08:11.917 10:39:28 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_12.conf 00:08:11.917 10:39:28 -- ../common.sh@72 -- # (( i++ )) 00:08:11.917 10:39:28 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:11.917 10:39:28 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:08:11.917 10:39:28 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:08:11.917 10:39:28 -- nvmf/run.sh@24 -- # local timen=1 00:08:11.917 10:39:28 -- nvmf/run.sh@25 -- # local core=0x1 00:08:11.917 10:39:28 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:11.917 10:39:28 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:08:11.917 10:39:28 -- nvmf/run.sh@29 -- # printf %02d 13 00:08:11.917 10:39:28 -- nvmf/run.sh@29 -- # port=4413 00:08:11.917 10:39:28 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:11.917 10:39:28 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:08:11.917 10:39:28 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:11.917 10:39:28 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 -r /var/tmp/spdk13.sock 00:08:11.917 [2024-07-13 10:39:28.274504] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:08:11.917 [2024-07-13 10:39:28.274569] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1995223 ] 00:08:11.917 EAL: No free 2048 kB hugepages reported on node 1 00:08:12.176 [2024-07-13 10:39:28.449890] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:12.176 [2024-07-13 10:39:28.469262] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:12.176 [2024-07-13 10:39:28.469382] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:12.176 [2024-07-13 10:39:28.520846] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:12.176 [2024-07-13 10:39:28.537127] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:08:12.176 INFO: Running with entropic power schedule (0xFF, 100). 
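(The "Recommended dictionary" block closing the fuzzer-12 summary above lists the byte tokens libFuzzer found productive, with "# Uses:" counting how often each fed a mutation; the "MS: 1 PersAutoDict- DE: ..." entries earlier in the run show those same tokens being replayed. One way to keep them around for seeding later runs is simply to cut the block out of a captured log, again with fuzz.log as a hypothetical capture file:

  # Print everything between the dictionary start and end markers.
  sed -n '/###### Recommended dictionary/,/###### End of recommended dictionary/p' fuzz.log
)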
00:08:12.176 INFO: Seed: 501486932 00:08:12.436 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:08:12.436 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:08:12.436 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:12.436 INFO: A corpus is not provided, starting from an empty corpus 00:08:12.436 #2 INITED exec/s: 0 rss: 60Mb 00:08:12.436 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:12.436 This may also happen if the target rejected all inputs we tried so far 00:08:12.436 [2024-07-13 10:39:28.592525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d0d0d0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.436 [2024-07-13 10:39:28.592557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.436 [2024-07-13 10:39:28.592612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.436 [2024-07-13 10:39:28.592627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.695 NEW_FUNC[1/670]: 0x4b0b80 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:08:12.695 NEW_FUNC[2/670]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:12.695 #5 NEW cov: 11509 ft: 11510 corp: 2/22b lim: 40 exec/s: 0 rss: 68Mb L: 21/21 MS: 3 ChangeByte-CopyPart-InsertRepeatedBytes- 00:08:12.695 [2024-07-13 10:39:28.913523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02cd0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.695 [2024-07-13 10:39:28.913586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.695 [2024-07-13 10:39:28.913688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.695 [2024-07-13 10:39:28.913718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.695 #11 NEW cov: 11622 ft: 12243 corp: 3/43b lim: 40 exec/s: 0 rss: 68Mb L: 21/21 MS: 1 ChangeByte- 00:08:12.695 [2024-07-13 10:39:28.963466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02cd0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.695 [2024-07-13 10:39:28.963494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.695 [2024-07-13 10:39:28.963551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.695 [2024-07-13 10:39:28.963565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.695 [2024-07-13 10:39:28.963618] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:d0d0e3d0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.695 [2024-07-13 10:39:28.963631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.695 #12 NEW cov: 11628 ft: 12669 corp: 4/69b lim: 40 exec/s: 0 rss: 68Mb L: 26/26 MS: 1 CrossOver- 00:08:12.695 [2024-07-13 10:39:29.003455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d0d0d0 cdw11:302f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.695 [2024-07-13 10:39:29.003481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.695 [2024-07-13 10:39:29.003535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2f2f2f34 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.695 [2024-07-13 10:39:29.003550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.695 #18 NEW cov: 11713 ft: 12880 corp: 5/90b lim: 40 exec/s: 0 rss: 68Mb L: 21/26 MS: 1 ChangeBinInt- 00:08:12.695 [2024-07-13 10:39:29.043526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02cd0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.695 [2024-07-13 10:39:29.043551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.695 [2024-07-13 10:39:29.043606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d0d0e3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.695 [2024-07-13 10:39:29.043622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.695 #19 NEW cov: 11713 ft: 13026 corp: 6/106b lim: 40 exec/s: 0 rss: 68Mb L: 16/26 MS: 1 EraseBytes- 00:08:12.955 [2024-07-13 10:39:29.083647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d0d0d0 cdw11:0ad0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.955 [2024-07-13 10:39:29.083674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.955 [2024-07-13 10:39:29.083731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.955 [2024-07-13 10:39:29.083745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.955 #20 NEW cov: 11713 ft: 13148 corp: 7/128b lim: 40 exec/s: 0 rss: 68Mb L: 22/26 MS: 1 CrossOver- 00:08:12.955 [2024-07-13 10:39:29.113733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02cd0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.955 [2024-07-13 10:39:29.113758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.955 [2024-07-13 10:39:29.113812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 
cdw10:d0d0d0d0 cdw11:d0d0d0e3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.955 [2024-07-13 10:39:29.113826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.955 #21 NEW cov: 11713 ft: 13293 corp: 8/144b lim: 40 exec/s: 0 rss: 69Mb L: 16/26 MS: 1 ShuffleBytes- 00:08:12.955 [2024-07-13 10:39:29.153824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02cd0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.955 [2024-07-13 10:39:29.153850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.955 [2024-07-13 10:39:29.153903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d0d0e3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.955 [2024-07-13 10:39:29.153917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.955 #22 NEW cov: 11713 ft: 13323 corp: 9/160b lim: 40 exec/s: 0 rss: 69Mb L: 16/26 MS: 1 ShuffleBytes- 00:08:12.955 [2024-07-13 10:39:29.193970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02cd0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.955 [2024-07-13 10:39:29.193995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.955 [2024-07-13 10:39:29.194053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d0d0e3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.955 [2024-07-13 10:39:29.194067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.955 #23 NEW cov: 11713 ft: 13333 corp: 10/176b lim: 40 exec/s: 0 rss: 69Mb L: 16/26 MS: 1 ShuffleBytes- 00:08:12.955 [2024-07-13 10:39:29.223925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02cd0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.955 [2024-07-13 10:39:29.223951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.955 #24 NEW cov: 11713 ft: 13737 corp: 11/187b lim: 40 exec/s: 0 rss: 69Mb L: 11/26 MS: 1 EraseBytes- 00:08:12.955 [2024-07-13 10:39:29.264199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02cd0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.955 [2024-07-13 10:39:29.264227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.955 [2024-07-13 10:39:29.264284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.955 [2024-07-13 10:39:29.264297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.955 #25 NEW cov: 11713 ft: 13758 corp: 12/205b lim: 40 exec/s: 0 rss: 69Mb L: 18/26 MS: 1 EraseBytes- 00:08:12.955 [2024-07-13 10:39:29.294250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE 
(1a) qid:0 cid:4 nsid:0 cdw10:e3d02cd0 cdw11:fdd0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.955 [2024-07-13 10:39:29.294275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.955 [2024-07-13 10:39:29.294330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.955 [2024-07-13 10:39:29.294343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.955 #26 NEW cov: 11713 ft: 13810 corp: 13/224b lim: 40 exec/s: 0 rss: 69Mb L: 19/26 MS: 1 InsertByte- 00:08:12.955 [2024-07-13 10:39:29.334378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02cd0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.955 [2024-07-13 10:39:29.334403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.955 [2024-07-13 10:39:29.334461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.955 [2024-07-13 10:39:29.334475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.241 #27 NEW cov: 11713 ft: 13834 corp: 14/245b lim: 40 exec/s: 0 rss: 69Mb L: 21/26 MS: 1 ShuffleBytes- 00:08:13.241 [2024-07-13 10:39:29.364468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02cd0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.241 [2024-07-13 10:39:29.364494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.241 [2024-07-13 10:39:29.364552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d0d0e3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.241 [2024-07-13 10:39:29.364565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.241 #28 NEW cov: 11713 ft: 13855 corp: 15/261b lim: 40 exec/s: 0 rss: 69Mb L: 16/26 MS: 1 ShuffleBytes- 00:08:13.241 [2024-07-13 10:39:29.404555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02cd0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.241 [2024-07-13 10:39:29.404579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.241 [2024-07-13 10:39:29.404651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d090d0 cdw11:d0d0d0e3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.241 [2024-07-13 10:39:29.404665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.241 #29 NEW cov: 11713 ft: 13889 corp: 16/277b lim: 40 exec/s: 0 rss: 69Mb L: 16/26 MS: 1 ChangeBit- 00:08:13.241 [2024-07-13 10:39:29.444680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02cd0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 
0x0 00:08:13.241 [2024-07-13 10:39:29.444709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.241 [2024-07-13 10:39:29.444765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d05151 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.241 [2024-07-13 10:39:29.444778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.241 #30 NEW cov: 11713 ft: 13951 corp: 17/300b lim: 40 exec/s: 0 rss: 69Mb L: 23/26 MS: 1 InsertRepeatedBytes- 00:08:13.241 [2024-07-13 10:39:29.474921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02cd0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.241 [2024-07-13 10:39:29.474946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.241 [2024-07-13 10:39:29.475002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.241 [2024-07-13 10:39:29.475016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.241 [2024-07-13 10:39:29.475072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:d0d0e3d0 cdw11:d0d0ccd0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.241 [2024-07-13 10:39:29.475086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.241 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:13.241 #31 NEW cov: 11736 ft: 14041 corp: 18/327b lim: 40 exec/s: 0 rss: 69Mb L: 27/27 MS: 1 InsertByte- 00:08:13.241 [2024-07-13 10:39:29.514802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d0d0d0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.241 [2024-07-13 10:39:29.514827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.241 #32 NEW cov: 11736 ft: 14111 corp: 19/339b lim: 40 exec/s: 0 rss: 69Mb L: 12/27 MS: 1 EraseBytes- 00:08:13.241 [2024-07-13 10:39:29.555044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02cd0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.241 [2024-07-13 10:39:29.555069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.241 [2024-07-13 10:39:29.555139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.241 [2024-07-13 10:39:29.555153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.241 #33 NEW cov: 11736 ft: 14127 corp: 20/357b lim: 40 exec/s: 0 rss: 69Mb L: 18/27 MS: 1 ChangeBinInt- 00:08:13.241 [2024-07-13 10:39:29.595113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02cd0 
cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.241 [2024-07-13 10:39:29.595138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.241 [2024-07-13 10:39:29.595190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.241 [2024-07-13 10:39:29.595204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.241 #34 NEW cov: 11736 ft: 14145 corp: 21/378b lim: 40 exec/s: 34 rss: 69Mb L: 21/27 MS: 1 ShuffleBytes- 00:08:13.500 [2024-07-13 10:39:29.635385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02cd0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.500 [2024-07-13 10:39:29.635413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.500 [2024-07-13 10:39:29.635473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.500 [2024-07-13 10:39:29.635486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.501 [2024-07-13 10:39:29.635535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:d0d0e3d0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.501 [2024-07-13 10:39:29.635548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.501 #35 NEW cov: 11736 ft: 14173 corp: 22/404b lim: 40 exec/s: 35 rss: 69Mb L: 26/27 MS: 1 CopyPart- 00:08:13.501 [2024-07-13 10:39:29.675479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2f2f2fe3 cdw11:d0d0d030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.501 [2024-07-13 10:39:29.675505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.501 [2024-07-13 10:39:29.675559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2f2f2f2f cdw11:2f2f34d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.501 [2024-07-13 10:39:29.675572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.501 [2024-07-13 10:39:29.675624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:d0d0d0d0 cdw11:d0d0d0e3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.501 [2024-07-13 10:39:29.675638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.501 #36 NEW cov: 11736 ft: 14227 corp: 23/428b lim: 40 exec/s: 36 rss: 69Mb L: 24/27 MS: 1 CopyPart- 00:08:13.501 [2024-07-13 10:39:29.715766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:14141414 cdw11:14141414 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.501 [2024-07-13 10:39:29.715790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:08:13.501 [2024-07-13 10:39:29.715845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:14141414 cdw11:14141414 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.501 [2024-07-13 10:39:29.715858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.501 [2024-07-13 10:39:29.715910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:14141414 cdw11:14141414 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.501 [2024-07-13 10:39:29.715923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.501 [2024-07-13 10:39:29.715991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:14141414 cdw11:14141414 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.501 [2024-07-13 10:39:29.716005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.501 #37 NEW cov: 11736 ft: 14681 corp: 24/461b lim: 40 exec/s: 37 rss: 69Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:13.501 [2024-07-13 10:39:29.755581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02cd0 cdw11:fdd0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.501 [2024-07-13 10:39:29.755605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.501 [2024-07-13 10:39:29.755666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d02cd0fd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.501 [2024-07-13 10:39:29.755680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.501 #38 NEW cov: 11736 ft: 14713 corp: 25/480b lim: 40 exec/s: 38 rss: 70Mb L: 19/33 MS: 1 CopyPart- 00:08:13.501 [2024-07-13 10:39:29.795753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02ad0 cdw11:302f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.501 [2024-07-13 10:39:29.795778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.501 [2024-07-13 10:39:29.795829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2f2f2f34 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.501 [2024-07-13 10:39:29.795842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.501 #39 NEW cov: 11736 ft: 14767 corp: 26/501b lim: 40 exec/s: 39 rss: 70Mb L: 21/33 MS: 1 ChangeByte- 00:08:13.501 [2024-07-13 10:39:29.835850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d0d02c cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.501 [2024-07-13 10:39:29.835875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.501 [2024-07-13 10:39:29.835927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d05151 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.501 [2024-07-13 10:39:29.835941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.501 #40 NEW cov: 11736 ft: 14805 corp: 27/524b lim: 40 exec/s: 40 rss: 70Mb L: 23/33 MS: 1 ShuffleBytes- 00:08:13.501 [2024-07-13 10:39:29.876000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d0d0d0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.501 [2024-07-13 10:39:29.876027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.501 [2024-07-13 10:39:29.876080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.501 [2024-07-13 10:39:29.876093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.761 #41 NEW cov: 11736 ft: 14817 corp: 28/545b lim: 40 exec/s: 41 rss: 70Mb L: 21/33 MS: 1 ShuffleBytes- 00:08:13.761 [2024-07-13 10:39:29.906058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02ccc cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.761 [2024-07-13 10:39:29.906083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.761 [2024-07-13 10:39:29.906134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d0d0e3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.761 [2024-07-13 10:39:29.906148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.761 #42 NEW cov: 11736 ft: 14842 corp: 29/561b lim: 40 exec/s: 42 rss: 70Mb L: 16/33 MS: 1 ChangeBinInt- 00:08:13.761 [2024-07-13 10:39:29.946159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d0d0d0 cdw11:0ad0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.761 [2024-07-13 10:39:29.946184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.761 [2024-07-13 10:39:29.946238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.761 [2024-07-13 10:39:29.946254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.761 #43 NEW cov: 11736 ft: 14846 corp: 30/583b lim: 40 exec/s: 43 rss: 70Mb L: 22/33 MS: 1 ShuffleBytes- 00:08:13.761 [2024-07-13 10:39:29.986275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02ad0 cdw11:302f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.761 [2024-07-13 10:39:29.986300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.761 [2024-07-13 10:39:29.986353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2f2f2f34 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.761 [2024-07-13 10:39:29.986367] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.761 #44 NEW cov: 11736 ft: 14869 corp: 31/604b lim: 40 exec/s: 44 rss: 70Mb L: 21/33 MS: 1 ShuffleBytes- 00:08:13.761 [2024-07-13 10:39:30.026261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02cd0 cdw11:d0e3d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.761 [2024-07-13 10:39:30.026286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.761 #45 NEW cov: 11736 ft: 14872 corp: 32/617b lim: 40 exec/s: 45 rss: 70Mb L: 13/33 MS: 1 CrossOver- 00:08:13.761 [2024-07-13 10:39:30.066498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02c01 cdw11:00d0fdd0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.761 [2024-07-13 10:39:30.066524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.761 [2024-07-13 10:39:30.066594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.761 [2024-07-13 10:39:30.066608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.761 #46 NEW cov: 11736 ft: 14882 corp: 33/638b lim: 40 exec/s: 46 rss: 70Mb L: 21/33 MS: 1 CMP- DE: "\001\000"- 00:08:13.761 [2024-07-13 10:39:30.106609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02cf0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.761 [2024-07-13 10:39:30.106635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.761 [2024-07-13 10:39:30.106706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d0d0e3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.761 [2024-07-13 10:39:30.106720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.761 #47 NEW cov: 11736 ft: 14888 corp: 34/654b lim: 40 exec/s: 47 rss: 70Mb L: 16/33 MS: 1 ChangeBit- 00:08:13.761 [2024-07-13 10:39:30.146700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02ccc cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.761 [2024-07-13 10:39:30.146725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.761 [2024-07-13 10:39:30.146778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d0d0e3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.761 [2024-07-13 10:39:30.146792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.021 #48 NEW cov: 11736 ft: 14900 corp: 35/670b lim: 40 exec/s: 48 rss: 70Mb L: 16/33 MS: 1 ShuffleBytes- 00:08:14.021 [2024-07-13 10:39:30.186800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02c01 cdw11:00d0fdd0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:14.021 [2024-07-13 10:39:30.186827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.022 [2024-07-13 10:39:30.186884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.022 [2024-07-13 10:39:30.186897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.022 #49 NEW cov: 11736 ft: 14914 corp: 36/686b lim: 40 exec/s: 49 rss: 70Mb L: 16/33 MS: 1 EraseBytes- 00:08:14.022 [2024-07-13 10:39:30.226967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02cd0 cdw11:d0d0c0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.022 [2024-07-13 10:39:30.226992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.022 [2024-07-13 10:39:30.227046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.022 [2024-07-13 10:39:30.227060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.022 #50 NEW cov: 11736 ft: 14930 corp: 37/704b lim: 40 exec/s: 50 rss: 70Mb L: 18/33 MS: 1 ChangeBit- 00:08:14.022 [2024-07-13 10:39:30.266972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d0d098 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.022 [2024-07-13 10:39:30.266996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.022 #51 NEW cov: 11736 ft: 14936 corp: 38/717b lim: 40 exec/s: 51 rss: 70Mb L: 13/33 MS: 1 InsertByte- 00:08:14.022 [2024-07-13 10:39:30.307055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02cd0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.022 [2024-07-13 10:39:30.307081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.022 #52 NEW cov: 11736 ft: 14943 corp: 39/726b lim: 40 exec/s: 52 rss: 70Mb L: 9/33 MS: 1 EraseBytes- 00:08:14.022 [2024-07-13 10:39:30.347351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02cd0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.022 [2024-07-13 10:39:30.347378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.022 [2024-07-13 10:39:30.347435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d0d060 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.022 [2024-07-13 10:39:30.347453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.022 #53 NEW cov: 11736 ft: 14947 corp: 40/744b lim: 40 exec/s: 53 rss: 70Mb L: 18/33 MS: 1 ChangeByte- 00:08:14.022 [2024-07-13 10:39:30.377331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02cd0 cdw11:d0d0d0d0 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:08:14.022 [2024-07-13 10:39:30.377356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.022 [2024-07-13 10:39:30.377414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d2d0d060 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.022 [2024-07-13 10:39:30.377428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.022 #54 NEW cov: 11736 ft: 14970 corp: 41/762b lim: 40 exec/s: 54 rss: 70Mb L: 18/33 MS: 1 ChangeBit- 00:08:14.282 [2024-07-13 10:39:30.417341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02cd0 cdw11:d090d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.282 [2024-07-13 10:39:30.417368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.282 #55 NEW cov: 11736 ft: 14975 corp: 42/773b lim: 40 exec/s: 55 rss: 70Mb L: 11/33 MS: 1 EraseBytes- 00:08:14.282 [2024-07-13 10:39:30.457609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02cd0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.282 [2024-07-13 10:39:30.457651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.282 [2024-07-13 10:39:30.457710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d0d066 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.282 [2024-07-13 10:39:30.457725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.282 #56 NEW cov: 11736 ft: 15008 corp: 43/790b lim: 40 exec/s: 56 rss: 70Mb L: 17/33 MS: 1 InsertByte- 00:08:14.282 [2024-07-13 10:39:30.497616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02cd0 cdw11:d02cd0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.282 [2024-07-13 10:39:30.497642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.282 #57 NEW cov: 11736 ft: 15019 corp: 44/803b lim: 40 exec/s: 57 rss: 70Mb L: 13/33 MS: 1 CopyPart- 00:08:14.282 [2024-07-13 10:39:30.537847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:e3d02cd0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.282 [2024-07-13 10:39:30.537872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.282 [2024-07-13 10:39:30.537944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d0d00100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.282 [2024-07-13 10:39:30.537958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.282 #58 NEW cov: 11736 ft: 15022 corp: 45/821b lim: 40 exec/s: 58 rss: 70Mb L: 18/33 MS: 1 PersAutoDict- DE: "\001\000"- 00:08:14.282 [2024-07-13 10:39:30.568001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 
cdw10:e3d02cd0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.282 [2024-07-13 10:39:30.568026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.282 [2024-07-13 10:39:30.568081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:d0d0d0d0 cdw11:d02cd0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.282 [2024-07-13 10:39:30.568095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.282 [2024-07-13 10:39:30.568148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:d0d0d0d0 cdw11:d0d0d0d0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.282 [2024-07-13 10:39:30.568162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.282 #59 NEW cov: 11736 ft: 15033 corp: 46/846b lim: 40 exec/s: 29 rss: 70Mb L: 25/33 MS: 1 CopyPart- 00:08:14.282 #59 DONE cov: 11736 ft: 15033 corp: 46/846b lim: 40 exec/s: 29 rss: 70Mb 00:08:14.282 ###### Recommended dictionary. ###### 00:08:14.282 "\001\000" # Uses: 1 00:08:14.282 ###### End of recommended dictionary. ###### 00:08:14.282 Done 59 runs in 2 second(s) 00:08:14.542 10:39:30 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_13.conf 00:08:14.542 10:39:30 -- ../common.sh@72 -- # (( i++ )) 00:08:14.542 10:39:30 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:14.542 10:39:30 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:08:14.542 10:39:30 -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:08:14.542 10:39:30 -- nvmf/run.sh@24 -- # local timen=1 00:08:14.542 10:39:30 -- nvmf/run.sh@25 -- # local core=0x1 00:08:14.542 10:39:30 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:14.542 10:39:30 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:08:14.542 10:39:30 -- nvmf/run.sh@29 -- # printf %02d 14 00:08:14.542 10:39:30 -- nvmf/run.sh@29 -- # port=4414 00:08:14.542 10:39:30 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:14.542 10:39:30 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:08:14.542 10:39:30 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:14.542 10:39:30 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 -r /var/tmp/spdk14.sock 00:08:14.542 [2024-07-13 10:39:30.726346] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:08:14.542 [2024-07-13 10:39:30.726414] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1995622 ] 00:08:14.542 EAL: No free 2048 kB hugepages reported on node 1 00:08:14.542 [2024-07-13 10:39:30.910506] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:14.801 [2024-07-13 10:39:30.930622] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:14.801 [2024-07-13 10:39:30.930750] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:14.801 [2024-07-13 10:39:30.982179] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:14.801 [2024-07-13 10:39:30.998482] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:08:14.801 INFO: Running with entropic power schedule (0xFF, 100). 00:08:14.801 INFO: Seed: 2963487738 00:08:14.801 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:08:14.801 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:08:14.801 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:14.801 INFO: A corpus is not provided, starting from an empty corpus 00:08:14.801 #2 INITED exec/s: 0 rss: 60Mb 00:08:14.801 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:14.801 This may also happen if the target rejected all inputs we tried so far 00:08:14.801 [2024-07-13 10:39:31.065259] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.801 [2024-07-13 10:39:31.065312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.801 [2024-07-13 10:39:31.065460] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.801 [2024-07-13 10:39:31.065478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.801 [2024-07-13 10:39:31.065605] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.801 [2024-07-13 10:39:31.065624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.060 NEW_FUNC[1/671]: 0x4b2740 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:08:15.060 NEW_FUNC[2/671]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:15.060 #4 NEW cov: 11502 ft: 11502 corp: 2/28b lim: 35 exec/s: 0 rss: 68Mb L: 27/27 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:15.060 [2024-07-13 10:39:31.396177] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.060 [2024-07-13 10:39:31.396226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.060 [2024-07-13 10:39:31.396381] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.060 [2024-07-13 10:39:31.396407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.060 [2024-07-13 10:39:31.396555] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.060 [2024-07-13 10:39:31.396577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.060 #5 NEW cov: 11616 ft: 12067 corp: 3/55b lim: 35 exec/s: 0 rss: 68Mb L: 27/27 MS: 1 ChangeBit- 00:08:15.319 [2024-07-13 10:39:31.456427] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.319 [2024-07-13 10:39:31.456461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.320 [2024-07-13 10:39:31.456599] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.320 [2024-07-13 10:39:31.456617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.320 [2024-07-13 10:39:31.456743] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.320 [2024-07-13 10:39:31.456761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.320 [2024-07-13 10:39:31.456880] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.320 [2024-07-13 10:39:31.456898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.320 #6 NEW cov: 11622 ft: 12556 corp: 4/87b lim: 35 exec/s: 0 rss: 68Mb L: 32/32 MS: 1 CopyPart- 00:08:15.320 [2024-07-13 10:39:31.516041] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.320 [2024-07-13 10:39:31.516070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.320 [2024-07-13 10:39:31.516211] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.320 [2024-07-13 10:39:31.516228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.320 #7 NEW cov: 11707 ft: 12997 corp: 5/103b lim: 35 exec/s: 0 rss: 68Mb L: 16/32 MS: 1 EraseBytes- 00:08:15.320 [2024-07-13 10:39:31.576794] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000005a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.320 [2024-07-13 10:39:31.576823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.320 [2024-07-13 10:39:31.576970] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000005a SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:15.320 [2024-07-13 10:39:31.576986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.320 [2024-07-13 10:39:31.577119] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000005a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.320 [2024-07-13 10:39:31.577140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.320 [2024-07-13 10:39:31.577269] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000005a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.320 [2024-07-13 10:39:31.577286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.320 #11 NEW cov: 11707 ft: 13073 corp: 6/137b lim: 35 exec/s: 0 rss: 68Mb L: 34/34 MS: 4 ChangeBit-ShuffleBytes-ChangeBit-InsertRepeatedBytes- 00:08:15.320 [2024-07-13 10:39:31.626181] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.320 [2024-07-13 10:39:31.626216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.320 #14 NEW cov: 11714 ft: 13897 corp: 7/147b lim: 35 exec/s: 0 rss: 68Mb L: 10/34 MS: 3 ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:15.320 [2024-07-13 10:39:31.667222] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.320 [2024-07-13 10:39:31.667250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.320 [2024-07-13 10:39:31.667390] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.320 [2024-07-13 10:39:31.667409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.320 [2024-07-13 10:39:31.667533] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.320 [2024-07-13 10:39:31.667562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.320 [2024-07-13 10:39:31.667691] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.320 [2024-07-13 10:39:31.667710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.320 [2024-07-13 10:39:31.667834] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.320 [2024-07-13 10:39:31.667850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:15.320 #20 NEW cov: 11714 ft: 14130 corp: 8/182b lim: 35 exec/s: 0 rss: 68Mb L: 35/35 MS: 1 CopyPart- 00:08:15.579 [2024-07-13 10:39:31.717034] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.579 [2024-07-13 
10:39:31.717062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.579 [2024-07-13 10:39:31.717206] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.579 [2024-07-13 10:39:31.717223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.579 NEW_FUNC[1/1]: 0x4d3fb0 in feat_async_event_cfg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:346 00:08:15.579 #23 NEW cov: 11814 ft: 14303 corp: 9/204b lim: 35 exec/s: 0 rss: 68Mb L: 22/35 MS: 3 ChangeByte-ChangeByte-CrossOver- 00:08:15.579 [2024-07-13 10:39:31.757204] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.579 [2024-07-13 10:39:31.757233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.579 [2024-07-13 10:39:31.757362] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.579 [2024-07-13 10:39:31.757382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.579 [2024-07-13 10:39:31.757514] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.579 [2024-07-13 10:39:31.757531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.579 [2024-07-13 10:39:31.757671] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.579 [2024-07-13 10:39:31.757686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.579 #24 NEW cov: 11814 ft: 14414 corp: 10/232b lim: 35 exec/s: 0 rss: 68Mb L: 28/35 MS: 1 InsertByte- 00:08:15.579 [2024-07-13 10:39:31.797513] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000005a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.579 [2024-07-13 10:39:31.797539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.579 [2024-07-13 10:39:31.797677] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000005a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.579 [2024-07-13 10:39:31.797693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.579 [2024-07-13 10:39:31.797828] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000005a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.579 [2024-07-13 10:39:31.797845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.579 [2024-07-13 10:39:31.797970] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000005a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.579 [2024-07-13 10:39:31.797990] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.579 #25 NEW cov: 11814 ft: 14461 corp: 11/266b lim: 35 exec/s: 0 rss: 69Mb L: 34/35 MS: 1 ShuffleBytes- 00:08:15.579 [2024-07-13 10:39:31.837402] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.579 [2024-07-13 10:39:31.837431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.579 [2024-07-13 10:39:31.837556] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.579 [2024-07-13 10:39:31.837572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.579 [2024-07-13 10:39:31.837707] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.579 [2024-07-13 10:39:31.837724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.579 [2024-07-13 10:39:31.837847] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.579 [2024-07-13 10:39:31.837862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.579 [2024-07-13 10:39:31.837991] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.579 [2024-07-13 10:39:31.838007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:15.579 #26 NEW cov: 11814 ft: 14505 corp: 12/301b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 CopyPart- 00:08:15.579 [2024-07-13 10:39:31.877301] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.579 [2024-07-13 10:39:31.877333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.579 [2024-07-13 10:39:31.877464] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.579 [2024-07-13 10:39:31.877483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.579 [2024-07-13 10:39:31.877622] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.579 [2024-07-13 10:39:31.877638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.579 [2024-07-13 10:39:31.877768] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.579 [2024-07-13 10:39:31.877784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.579 #27 NEW cov: 11814 ft: 14560 corp: 13/333b lim: 35 exec/s: 0 rss: 69Mb L: 32/35 MS: 1 CopyPart- 
00:08:15.579 [2024-07-13 10:39:31.927077] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.579 [2024-07-13 10:39:31.927110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.579 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:15.579 #28 NEW cov: 11837 ft: 14649 corp: 14/344b lim: 35 exec/s: 0 rss: 69Mb L: 11/35 MS: 1 InsertByte- 00:08:15.838 [2024-07-13 10:39:31.977069] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.838 [2024-07-13 10:39:31.977096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.838 [2024-07-13 10:39:31.977239] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.838 [2024-07-13 10:39:31.977257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.838 #29 NEW cov: 11837 ft: 14669 corp: 15/364b lim: 35 exec/s: 0 rss: 69Mb L: 20/35 MS: 1 CopyPart- 00:08:15.838 [2024-07-13 10:39:32.028192] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000053 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.838 [2024-07-13 10:39:32.028219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.838 [2024-07-13 10:39:32.028359] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000053 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.838 [2024-07-13 10:39:32.028379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.838 [2024-07-13 10:39:32.028507] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000053 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.838 [2024-07-13 10:39:32.028525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.838 [2024-07-13 10:39:32.028650] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000053 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.838 [2024-07-13 10:39:32.028666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.838 #30 NEW cov: 11837 ft: 14707 corp: 16/398b lim: 35 exec/s: 30 rss: 69Mb L: 34/35 MS: 1 InsertRepeatedBytes- 00:08:15.838 [2024-07-13 10:39:32.087891] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.838 [2024-07-13 10:39:32.087924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.838 [2024-07-13 10:39:32.088053] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000e2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.838 [2024-07-13 10:39:32.088078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.838 #31 NEW cov: 11837 ft: 14744 corp: 17/414b lim: 35 exec/s: 31 rss: 69Mb L: 16/35 MS: 1 ChangeBinInt- 00:08:15.838 [2024-07-13 10:39:32.138558] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.838 [2024-07-13 10:39:32.138586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.838 [2024-07-13 10:39:32.138726] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.838 [2024-07-13 10:39:32.138744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.838 [2024-07-13 10:39:32.138881] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.838 [2024-07-13 10:39:32.138898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.838 [2024-07-13 10:39:32.139032] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.838 [2024-07-13 10:39:32.139050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.839 #32 NEW cov: 11837 ft: 14829 corp: 18/446b lim: 35 exec/s: 32 rss: 69Mb L: 32/35 MS: 1 ChangeBit- 00:08:15.839 [2024-07-13 10:39:32.188698] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.839 [2024-07-13 10:39:32.188730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.839 [2024-07-13 10:39:32.188873] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.839 [2024-07-13 10:39:32.188892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.839 [2024-07-13 10:39:32.189038] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.839 [2024-07-13 10:39:32.189055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.839 [2024-07-13 10:39:32.189186] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.839 [2024-07-13 10:39:32.189205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.839 #33 NEW cov: 11837 ft: 14848 corp: 19/478b lim: 35 exec/s: 33 rss: 69Mb L: 32/35 MS: 1 ChangeByte- 00:08:16.098 [2024-07-13 10:39:32.238799] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.098 [2024-07-13 10:39:32.238828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.098 [2024-07-13 10:39:32.238963] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.098 [2024-07-13 10:39:32.238980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.098 [2024-07-13 10:39:32.239083] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.098 [2024-07-13 10:39:32.239108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.098 [2024-07-13 10:39:32.239247] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.098 [2024-07-13 10:39:32.239265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.098 #34 NEW cov: 11837 ft: 14880 corp: 20/510b lim: 35 exec/s: 34 rss: 69Mb L: 32/35 MS: 1 ChangeBit- 00:08:16.098 [2024-07-13 10:39:32.289168] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.098 [2024-07-13 10:39:32.289195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.098 [2024-07-13 10:39:32.289322] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.098 [2024-07-13 10:39:32.289340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.098 [2024-07-13 10:39:32.289483] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.098 [2024-07-13 10:39:32.289502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.099 [2024-07-13 10:39:32.289640] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.099 [2024-07-13 10:39:32.289656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.099 [2024-07-13 10:39:32.289786] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.099 [2024-07-13 10:39:32.289803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:16.099 #35 NEW cov: 11837 ft: 14884 corp: 21/545b lim: 35 exec/s: 35 rss: 69Mb L: 35/35 MS: 1 CopyPart- 00:08:16.099 [2024-07-13 10:39:32.348934] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.099 [2024-07-13 10:39:32.348961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.099 [2024-07-13 10:39:32.349098] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.099 [2024-07-13 10:39:32.349119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.099 [2024-07-13 10:39:32.349256] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.099 [2024-07-13 10:39:32.349275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.099 #36 NEW cov: 11837 ft: 14907 corp: 22/572b lim: 35 exec/s: 36 rss: 70Mb L: 27/35 MS: 1 ChangeByte- 00:08:16.099 [2024-07-13 10:39:32.399109] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.099 [2024-07-13 10:39:32.399140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.099 [2024-07-13 10:39:32.399280] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.099 [2024-07-13 10:39:32.399299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.099 [2024-07-13 10:39:32.399438] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.099 [2024-07-13 10:39:32.399463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.099 #37 NEW cov: 11837 ft: 14967 corp: 23/599b lim: 35 exec/s: 37 rss: 70Mb L: 27/35 MS: 1 CrossOver- 00:08:16.099 [2024-07-13 10:39:32.449759] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.099 [2024-07-13 10:39:32.449790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.099 [2024-07-13 10:39:32.449927] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.099 [2024-07-13 10:39:32.449944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.099 [2024-07-13 10:39:32.450083] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.099 [2024-07-13 10:39:32.450100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.099 [2024-07-13 10:39:32.450224] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.099 [2024-07-13 10:39:32.450241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.099 [2024-07-13 10:39:32.450367] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.099 [2024-07-13 10:39:32.450387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:16.099 #38 NEW cov: 11837 ft: 14980 corp: 24/634b lim: 35 exec/s: 38 rss: 70Mb L: 35/35 MS: 1 ChangeByte- 00:08:16.358 [2024-07-13 
10:39:32.509774] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000005a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.358 [2024-07-13 10:39:32.509804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.358 [2024-07-13 10:39:32.509942] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000005a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.358 [2024-07-13 10:39:32.509961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.358 [2024-07-13 10:39:32.510100] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000005a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.358 [2024-07-13 10:39:32.510117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.358 [2024-07-13 10:39:32.510251] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:8000005a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.358 [2024-07-13 10:39:32.510274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.358 #39 NEW cov: 11837 ft: 14990 corp: 25/668b lim: 35 exec/s: 39 rss: 70Mb L: 34/35 MS: 1 ChangeBit- 00:08:16.358 [2024-07-13 10:39:32.570202] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.358 [2024-07-13 10:39:32.570231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.359 [2024-07-13 10:39:32.570359] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.359 [2024-07-13 10:39:32.570378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.359 [2024-07-13 10:39:32.570518] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.359 [2024-07-13 10:39:32.570538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.359 [2024-07-13 10:39:32.570673] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.359 [2024-07-13 10:39:32.570691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.359 [2024-07-13 10:39:32.570828] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.359 [2024-07-13 10:39:32.570846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:16.359 #40 NEW cov: 11837 ft: 15008 corp: 26/703b lim: 35 exec/s: 40 rss: 70Mb L: 35/35 MS: 1 ShuffleBytes- 00:08:16.359 [2024-07-13 10:39:32.629787] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.359 [2024-07-13 10:39:32.629817] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.359 [2024-07-13 10:39:32.629952] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.359 [2024-07-13 10:39:32.629970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.359 [2024-07-13 10:39:32.630105] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.359 [2024-07-13 10:39:32.630123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.359 #41 NEW cov: 11837 ft: 15026 corp: 27/730b lim: 35 exec/s: 41 rss: 70Mb L: 27/35 MS: 1 CopyPart- 00:08:16.359 [2024-07-13 10:39:32.679546] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000053 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.359 [2024-07-13 10:39:32.679574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.359 [2024-07-13 10:39:32.679712] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000053 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.359 [2024-07-13 10:39:32.679728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.359 [2024-07-13 10:39:32.679860] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:80000053 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.359 [2024-07-13 10:39:32.679880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.359 #42 NEW cov: 11837 ft: 15044 corp: 28/755b lim: 35 exec/s: 42 rss: 70Mb L: 25/35 MS: 1 EraseBytes- 00:08:16.359 [2024-07-13 10:39:32.719789] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.359 [2024-07-13 10:39:32.719815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.359 [2024-07-13 10:39:32.719954] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.359 [2024-07-13 10:39:32.719972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.359 [2024-07-13 10:39:32.720097] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.359 [2024-07-13 10:39:32.720116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.359 [2024-07-13 10:39:32.720239] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.359 [2024-07-13 10:39:32.720256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.359 #43 NEW cov: 11837 ft: 15055 corp: 29/787b lim: 35 exec/s: 43 rss: 70Mb L: 32/35 MS: 1 
ShuffleBytes- 00:08:16.617 [2024-07-13 10:39:32.759922] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.617 [2024-07-13 10:39:32.759949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.617 [2024-07-13 10:39:32.760086] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.617 [2024-07-13 10:39:32.760104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.617 [2024-07-13 10:39:32.760234] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:000000bc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.617 [2024-07-13 10:39:32.760252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.617 [2024-07-13 10:39:32.760385] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.617 [2024-07-13 10:39:32.760404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.617 #44 NEW cov: 11837 ft: 15084 corp: 30/819b lim: 35 exec/s: 44 rss: 70Mb L: 32/35 MS: 1 CMP- DE: "\352X\255)\274h)\000"- 00:08:16.617 [2024-07-13 10:39:32.800905] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.617 [2024-07-13 10:39:32.800933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.617 [2024-07-13 10:39:32.801074] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.617 [2024-07-13 10:39:32.801092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.617 [2024-07-13 10:39:32.801219] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.617 [2024-07-13 10:39:32.801237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.617 [2024-07-13 10:39:32.801360] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.617 [2024-07-13 10:39:32.801375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.617 [2024-07-13 10:39:32.801508] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.617 [2024-07-13 10:39:32.801523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:16.617 #45 NEW cov: 11837 ft: 15095 corp: 31/854b lim: 35 exec/s: 45 rss: 70Mb L: 35/35 MS: 1 ChangeByte- 00:08:16.617 [2024-07-13 10:39:32.839793] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:16.617 [2024-07-13 10:39:32.839820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.617 [2024-07-13 10:39:32.839947] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000e2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.617 [2024-07-13 10:39:32.839973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.617 #46 NEW cov: 11837 ft: 15102 corp: 32/870b lim: 35 exec/s: 46 rss: 70Mb L: 16/35 MS: 1 ChangeBit- 00:08:16.617 [2024-07-13 10:39:32.890789] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000053 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.617 [2024-07-13 10:39:32.890818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.617 [2024-07-13 10:39:32.890946] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000053 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.617 [2024-07-13 10:39:32.890963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.617 [2024-07-13 10:39:32.891092] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000053 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.617 [2024-07-13 10:39:32.891110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.617 [2024-07-13 10:39:32.891239] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000053 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.617 [2024-07-13 10:39:32.891256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.617 #47 NEW cov: 11837 ft: 15175 corp: 33/904b lim: 35 exec/s: 47 rss: 70Mb L: 34/35 MS: 1 ChangeBinInt- 00:08:16.617 [2024-07-13 10:39:32.940122] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000008c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.617 [2024-07-13 10:39:32.940151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.617 #48 NEW cov: 11837 ft: 15203 corp: 34/916b lim: 35 exec/s: 48 rss: 70Mb L: 12/35 MS: 1 InsertByte- 00:08:16.617 [2024-07-13 10:39:32.980927] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.617 [2024-07-13 10:39:32.980955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.617 [2024-07-13 10:39:32.981086] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.617 [2024-07-13 10:39:32.981103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.617 [2024-07-13 10:39:32.981228] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:000000bc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.617 [2024-07-13 10:39:32.981245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.617 [2024-07-13 10:39:32.981374] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.617 [2024-07-13 10:39:32.981390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.617 [2024-07-13 10:39:32.981529] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.617 [2024-07-13 10:39:32.981545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:16.875 #49 NEW cov: 11837 ft: 15213 corp: 35/951b lim: 35 exec/s: 49 rss: 70Mb L: 35/35 MS: 1 PersAutoDict- DE: "\352X\255)\274h)\000"- 00:08:16.875 [2024-07-13 10:39:33.021044] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.875 [2024-07-13 10:39:33.021072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.875 [2024-07-13 10:39:33.021179] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.875 [2024-07-13 10:39:33.021198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.875 [2024-07-13 10:39:33.021331] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.875 [2024-07-13 10:39:33.021348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.875 [2024-07-13 10:39:33.021484] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.875 [2024-07-13 10:39:33.021503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.875 [2024-07-13 10:39:33.021639] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:0000001d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.875 [2024-07-13 10:39:33.021658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:16.875 #50 NEW cov: 11837 ft: 15221 corp: 36/986b lim: 35 exec/s: 25 rss: 70Mb L: 35/35 MS: 1 ChangeBit- 00:08:16.875 #50 DONE cov: 11837 ft: 15221 corp: 36/986b lim: 35 exec/s: 25 rss: 70Mb 00:08:16.875 ###### Recommended dictionary. ###### 00:08:16.875 "\352X\255)\274h)\000" # Uses: 1 00:08:16.875 ###### End of recommended dictionary. 
######
00:08:16.875 Done 50 runs in 2 second(s)
00:08:16.875 10:39:33 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_14.conf
00:08:16.875 10:39:33 -- ../common.sh@72 -- # (( i++ ))
00:08:16.875 10:39:33 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:16.875 10:39:33 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1
00:08:16.875 10:39:33 -- nvmf/run.sh@23 -- # local fuzzer_type=15
00:08:16.875 10:39:33 -- nvmf/run.sh@24 -- # local timen=1
00:08:16.875 10:39:33 -- nvmf/run.sh@25 -- # local core=0x1
00:08:16.875 10:39:33 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15
00:08:16.875 10:39:33 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf
00:08:16.875 10:39:33 -- nvmf/run.sh@29 -- # printf %02d 15
00:08:16.875 10:39:33 -- nvmf/run.sh@29 -- # port=4415
00:08:16.875 10:39:33 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15
00:08:16.875 10:39:33 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415'
00:08:16.875 10:39:33 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:16.875 10:39:33 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 -r /var/tmp/spdk15.sock
00:08:16.875 [2024-07-13 10:39:33.181259] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... [2024-07-13 10:39:33.181329] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1996056 ]
00:08:16.875 EAL: No free 2048 kB hugepages reported on node 1
00:08:17.134 [2024-07-13 10:39:33.355349] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:17.134 [2024-07-13 10:39:33.375068] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:17.134 [2024-07-13 10:39:33.375207] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:17.134 [2024-07-13 10:39:33.426860] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:17.134 [2024-07-13 10:39:33.443117] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 ***
00:08:17.134 INFO: Running with entropic power schedule (0xFF, 100).
00:08:17.134 INFO: Seed: 1110521415
00:08:17.134 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9),
00:08:17.134 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280),
00:08:17.134 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15
00:08:17.134 INFO: A corpus is not provided, starting from an empty corpus
00:08:17.134 #2 INITED exec/s: 0 rss: 60Mb
00:08:17.134 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
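[Editor's note] Each start_llvm_fuzz iteration traced above follows the same pattern: derive a per-run NVMe/TCP service port from the fuzzer index, rewrite the shared JSON target config to listen on that port, and launch llvm_nvme_fuzz against the resulting transport ID. The snippet below is a minimal sketch of that setup inferred from the logged commands, not the actual nvmf/run.sh source: the 44XX port rule is deduced from "printf %02d 15" followed by "port=4415", and the redirection into the per-run config file is assumed, since bash xtrace does not echo redirections.

    #!/usr/bin/env bash
    # Hypothetical reconstruction of one start_llvm_fuzz iteration (run 15).
    # The 44XX port rule and the sed redirection are assumptions from the trace.
    SPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk

    fuzzer_type=15   # fuzzer index for this run
    timen=1          # seconds to fuzz (-t)
    core=0x1         # reactor core mask (-m)

    port="44$(printf %02d "$fuzzer_type")"                     # 15 -> 4415
    corpus_dir="$SPDK_DIR/../corpus/llvm_nvmf_${fuzzer_type}"
    nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:${port}"

    mkdir -p "$corpus_dir"

    # Point the shared target config at this run's port (writing the result
    # to the per-run config file is assumed; the trace only shows the sed).
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${port}\"/" \
        "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

    # Launch the fuzzer against the rewritten target; the -P artifact-prefix
    # argument seen in the trace is omitted here for brevity.
    "$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
        -m "$core" -s 512 -F "$trid" -c "$nvmf_cfg" -t "$timen" \
        -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk${fuzzer_type}.sock"

Run 16 further down repeats the identical sequence with fuzzer_type=16, which is why its target listens on port 4416 and its RPC socket becomes /var/tmp/spdk16.sock.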
00:08:17.134 This may also happen if the target rejected all inputs we tried so far 00:08:17.651 NEW_FUNC[1/657]: 0x4b3c80 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:08:17.651 NEW_FUNC[2/657]: 0x4d3ae0 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:17.651 #9 NEW cov: 11377 ft: 11378 corp: 2/14b lim: 35 exec/s: 0 rss: 68Mb L: 13/13 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:17.651 [2024-07-13 10:39:33.834470] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000012 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.651 [2024-07-13 10:39:33.834522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.651 [2024-07-13 10:39:33.834666] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000012 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.651 [2024-07-13 10:39:33.834687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.651 [2024-07-13 10:39:33.834832] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.651 [2024-07-13 10:39:33.834854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.651 NEW_FUNC[1/14]: 0x16d0840 in spdk_nvme_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:263 00:08:17.651 NEW_FUNC[2/14]: 0x16d0a80 in nvme_admin_qpair_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:202 00:08:17.651 #10 NEW cov: 11618 ft: 12676 corp: 3/45b lim: 35 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 InsertRepeatedBytes- 00:08:17.651 #16 NEW cov: 11624 ft: 12843 corp: 4/58b lim: 35 exec/s: 0 rss: 68Mb L: 13/31 MS: 1 ChangeBinInt- 00:08:17.651 #17 NEW cov: 11709 ft: 13187 corp: 5/71b lim: 35 exec/s: 0 rss: 68Mb L: 13/31 MS: 1 CMP- DE: "\036\000\000\000"- 00:08:17.651 [2024-07-13 10:39:33.984656] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.651 [2024-07-13 10:39:33.984690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.651 [2024-07-13 10:39:33.984838] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.651 [2024-07-13 10:39:33.984858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.651 [2024-07-13 10:39:33.984993] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.651 [2024-07-13 10:39:33.985009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.651 #18 NEW cov: 11709 ft: 13252 corp: 6/104b lim: 35 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:17.651 [2024-07-13 10:39:34.024316] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000012 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.651 [2024-07-13 10:39:34.024346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.651 [2024-07-13 10:39:34.024476] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000012 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.651 [2024-07-13 10:39:34.024498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.651 [2024-07-13 10:39:34.024625] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.651 [2024-07-13 10:39:34.024643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.909 #19 NEW cov: 11709 ft: 13338 corp: 7/135b lim: 35 exec/s: 0 rss: 68Mb L: 31/33 MS: 1 ChangeBinInt- 00:08:17.909 [2024-07-13 10:39:34.084423] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.909 [2024-07-13 10:39:34.084455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.909 #20 NEW cov: 11709 ft: 13745 corp: 8/153b lim: 35 exec/s: 0 rss: 68Mb L: 18/33 MS: 1 CrossOver- 00:08:17.909 [2024-07-13 10:39:34.124503] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.909 [2024-07-13 10:39:34.124531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.909 #21 NEW cov: 11709 ft: 13774 corp: 9/170b lim: 35 exec/s: 0 rss: 69Mb L: 17/33 MS: 1 CopyPart- 00:08:17.909 [2024-07-13 10:39:34.164700] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.909 [2024-07-13 10:39:34.164726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.909 NEW_FUNC[1/1]: 0x4d3fb0 in feat_async_event_cfg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:346 00:08:17.910 #25 NEW cov: 11813 ft: 13896 corp: 10/185b lim: 35 exec/s: 0 rss: 69Mb L: 15/33 MS: 4 InsertByte-EraseBytes-ChangeBit-InsertRepeatedBytes- 00:08:17.910 [2024-07-13 10:39:34.204791] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.910 [2024-07-13 10:39:34.204818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.910 #26 NEW cov: 11813 ft: 13922 corp: 11/203b lim: 35 exec/s: 0 rss: 69Mb L: 18/33 MS: 1 ChangeBinInt- 00:08:17.910 [2024-07-13 10:39:34.244851] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000012 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.910 [2024-07-13 10:39:34.244880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.910 [2024-07-13 10:39:34.245017] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:17.910 [2024-07-13 10:39:34.245035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.910 #27 NEW cov: 11813 ft: 14110 corp: 12/221b lim: 35 exec/s: 0 rss: 69Mb L: 18/33 MS: 1 ChangeBinInt- 00:08:18.168 #28 NEW cov: 11813 ft: 14159 corp: 13/234b lim: 35 exec/s: 0 rss: 69Mb L: 13/33 MS: 1 ChangeByte- 00:08:18.168 [2024-07-13 10:39:34.324935] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.168 [2024-07-13 10:39:34.324962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.168 #29 NEW cov: 11813 ft: 14187 corp: 14/252b lim: 35 exec/s: 0 rss: 69Mb L: 18/33 MS: 1 ChangeBinInt- 00:08:18.168 [2024-07-13 10:39:34.365457] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.168 [2024-07-13 10:39:34.365482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.168 [2024-07-13 10:39:34.365614] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.168 [2024-07-13 10:39:34.365634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.168 [2024-07-13 10:39:34.365761] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.168 [2024-07-13 10:39:34.365779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.168 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:18.168 #30 NEW cov: 11836 ft: 14234 corp: 15/285b lim: 35 exec/s: 0 rss: 69Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:18.168 [2024-07-13 10:39:34.416030] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000012 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.168 [2024-07-13 10:39:34.416056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.168 [2024-07-13 10:39:34.416186] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000012 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.168 [2024-07-13 10:39:34.416203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.168 [2024-07-13 10:39:34.416337] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.168 [2024-07-13 10:39:34.416354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.168 [2024-07-13 10:39:34.416487] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.168 [2024-07-13 10:39:34.416506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:18.168 #31 
NEW cov: 11836 ft: 14458 corp: 16/320b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 PersAutoDict- DE: "\036\000\000\000"- 00:08:18.168 #32 NEW cov: 11836 ft: 14479 corp: 17/333b lim: 35 exec/s: 0 rss: 69Mb L: 13/35 MS: 1 ChangeByte- 00:08:18.168 [2024-07-13 10:39:34.495988] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000007d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.168 [2024-07-13 10:39:34.496013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.168 [2024-07-13 10:39:34.496144] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000012 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.168 [2024-07-13 10:39:34.496160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.168 [2024-07-13 10:39:34.496283] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000012 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.168 [2024-07-13 10:39:34.496298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.168 [2024-07-13 10:39:34.496430] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.168 [2024-07-13 10:39:34.496449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.168 #33 NEW cov: 11836 ft: 14515 corp: 18/364b lim: 35 exec/s: 33 rss: 69Mb L: 31/35 MS: 1 ChangeByte- 00:08:18.427 #34 NEW cov: 11836 ft: 14524 corp: 19/377b lim: 35 exec/s: 34 rss: 69Mb L: 13/35 MS: 1 ChangeBinInt- 00:08:18.427 [2024-07-13 10:39:34.576135] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.427 [2024-07-13 10:39:34.576161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.427 [2024-07-13 10:39:34.576308] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.427 [2024-07-13 10:39:34.576325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.427 #35 NEW cov: 11836 ft: 14611 corp: 20/399b lim: 35 exec/s: 35 rss: 69Mb L: 22/35 MS: 1 PersAutoDict- DE: "\036\000\000\000"- 00:08:18.427 [2024-07-13 10:39:34.616388] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.427 [2024-07-13 10:39:34.616416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.427 [2024-07-13 10:39:34.616531] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.427 [2024-07-13 10:39:34.616560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.427 [2024-07-13 10:39:34.616700] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:18.427 [2024-07-13 10:39:34.616717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.427 #36 NEW cov: 11836 ft: 14622 corp: 21/432b lim: 35 exec/s: 36 rss: 70Mb L: 33/35 MS: 1 ChangeByte- 00:08:18.427 #37 NEW cov: 11836 ft: 14650 corp: 22/445b lim: 35 exec/s: 37 rss: 70Mb L: 13/35 MS: 1 EraseBytes- 00:08:18.427 [2024-07-13 10:39:34.706236] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.427 [2024-07-13 10:39:34.706264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.427 #38 NEW cov: 11836 ft: 14745 corp: 23/462b lim: 35 exec/s: 38 rss: 70Mb L: 17/35 MS: 1 EraseBytes- 00:08:18.427 [2024-07-13 10:39:34.746349] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.427 [2024-07-13 10:39:34.746374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.427 [2024-07-13 10:39:34.746515] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.427 [2024-07-13 10:39:34.746533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.427 [2024-07-13 10:39:34.746660] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.427 [2024-07-13 10:39:34.746677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.427 #39 NEW cov: 11836 ft: 14818 corp: 24/495b lim: 35 exec/s: 39 rss: 70Mb L: 33/35 MS: 1 PersAutoDict- DE: "\036\000\000\000"- 00:08:18.427 [2024-07-13 10:39:34.786401] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.427 [2024-07-13 10:39:34.786429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.427 #40 NEW cov: 11836 ft: 14886 corp: 25/509b lim: 35 exec/s: 40 rss: 70Mb L: 14/35 MS: 1 InsertByte- 00:08:18.686 [2024-07-13 10:39:34.827064] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.686 [2024-07-13 10:39:34.827091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.686 [2024-07-13 10:39:34.827233] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.686 [2024-07-13 10:39:34.827253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.686 [2024-07-13 10:39:34.827393] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.686 [2024-07-13 10:39:34.827409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.686 #41 NEW cov: 
11836 ft: 14904 corp: 26/542b lim: 35 exec/s: 41 rss: 70Mb L: 33/35 MS: 1 CopyPart- 00:08:18.686 [2024-07-13 10:39:34.877162] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.686 [2024-07-13 10:39:34.877189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.686 [2024-07-13 10:39:34.877322] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.686 [2024-07-13 10:39:34.877340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.686 [2024-07-13 10:39:34.877476] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.686 [2024-07-13 10:39:34.877492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.686 #42 NEW cov: 11836 ft: 14924 corp: 27/575b lim: 35 exec/s: 42 rss: 70Mb L: 33/35 MS: 1 ChangeByte- 00:08:18.686 [2024-07-13 10:39:34.926524] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.686 [2024-07-13 10:39:34.926553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.686 #43 NEW cov: 11836 ft: 14933 corp: 28/593b lim: 35 exec/s: 43 rss: 70Mb L: 18/35 MS: 1 InsertByte- 00:08:18.686 [2024-07-13 10:39:34.977163] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.686 [2024-07-13 10:39:34.977192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.686 [2024-07-13 10:39:34.977327] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.686 [2024-07-13 10:39:34.977348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.686 [2024-07-13 10:39:34.977492] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.686 [2024-07-13 10:39:34.977510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.686 #44 NEW cov: 11836 ft: 14954 corp: 29/626b lim: 35 exec/s: 44 rss: 70Mb L: 33/35 MS: 1 ChangeBit- 00:08:18.686 #45 NEW cov: 11836 ft: 14961 corp: 30/639b lim: 35 exec/s: 45 rss: 70Mb L: 13/35 MS: 1 ShuffleBytes- 00:08:18.686 [2024-07-13 10:39:35.066813] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000043 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.686 [2024-07-13 10:39:35.066842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.686 [2024-07-13 10:39:35.066973] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.686 [2024-07-13 10:39:35.066992] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.945 #46 NEW cov: 11836 ft: 14967 corp: 31/656b lim: 35 exec/s: 46 rss: 70Mb L: 17/35 MS: 1 CrossOver- 00:08:18.945 #47 NEW cov: 11836 ft: 14971 corp: 32/669b lim: 35 exec/s: 47 rss: 70Mb L: 13/35 MS: 1 ChangeByte- 00:08:18.945 [2024-07-13 10:39:35.147635] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.945 [2024-07-13 10:39:35.147665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.945 #48 NEW cov: 11836 ft: 14989 corp: 33/689b lim: 35 exec/s: 48 rss: 70Mb L: 20/35 MS: 1 InsertRepeatedBytes- 00:08:18.945 #49 NEW cov: 11836 ft: 14997 corp: 34/702b lim: 35 exec/s: 49 rss: 70Mb L: 13/35 MS: 1 ChangeByte- 00:08:18.945 #50 NEW cov: 11836 ft: 15011 corp: 35/711b lim: 35 exec/s: 50 rss: 70Mb L: 9/35 MS: 1 EraseBytes- 00:08:18.945 [2024-07-13 10:39:35.278368] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.945 [2024-07-13 10:39:35.278399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.945 [2024-07-13 10:39:35.278529] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.945 [2024-07-13 10:39:35.278547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.945 [2024-07-13 10:39:35.278680] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.945 [2024-07-13 10:39:35.278698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.945 #51 NEW cov: 11836 ft: 15017 corp: 36/740b lim: 35 exec/s: 51 rss: 70Mb L: 29/35 MS: 1 InsertRepeatedBytes- 00:08:19.204 #52 NEW cov: 11836 ft: 15018 corp: 37/753b lim: 35 exec/s: 52 rss: 70Mb L: 13/35 MS: 1 ChangeBit- 00:08:19.204 [2024-07-13 10:39:35.368284] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.204 [2024-07-13 10:39:35.368313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.204 #53 NEW cov: 11836 ft: 15038 corp: 38/767b lim: 35 exec/s: 53 rss: 70Mb L: 14/35 MS: 1 InsertByte- 00:08:19.204 [2024-07-13 10:39:35.408489] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000007d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.204 [2024-07-13 10:39:35.408518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.204 [2024-07-13 10:39:35.408644] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000012 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.204 [2024-07-13 10:39:35.408662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.204 [2024-07-13 10:39:35.408786] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET 
FEATURES RESERVED cid:6 cdw10:00000012 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.204 [2024-07-13 10:39:35.408803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.204 [2024-07-13 10:39:35.408927] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.204 [2024-07-13 10:39:35.408944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.204 [2024-07-13 10:39:35.409070] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.204 [2024-07-13 10:39:35.409089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:19.204 #54 NEW cov: 11836 ft: 15099 corp: 39/802b lim: 35 exec/s: 54 rss: 70Mb L: 35/35 MS: 1 PersAutoDict- DE: "\036\000\000\000"- 00:08:19.204 [2024-07-13 10:39:35.458911] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.204 [2024-07-13 10:39:35.458942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.204 [2024-07-13 10:39:35.459039] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.204 [2024-07-13 10:39:35.459056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.204 [2024-07-13 10:39:35.459191] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.204 [2024-07-13 10:39:35.459209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.204 #55 NEW cov: 11836 ft: 15104 corp: 40/836b lim: 35 exec/s: 55 rss: 70Mb L: 34/35 MS: 1 InsertByte- 00:08:19.204 [2024-07-13 10:39:35.498380] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000019c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.204 [2024-07-13 10:39:35.498406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.204 #56 NEW cov: 11836 ft: 15120 corp: 41/850b lim: 35 exec/s: 28 rss: 70Mb L: 14/35 MS: 1 InsertByte- 00:08:19.204 #56 DONE cov: 11836 ft: 15120 corp: 41/850b lim: 35 exec/s: 28 rss: 70Mb 00:08:19.204 ###### Recommended dictionary. ###### 00:08:19.204 "\036\000\000\000" # Uses: 4 00:08:19.204 ###### End of recommended dictionary. 
######
00:08:19.204 Done 56 runs in 2 second(s)
00:08:19.463 10:39:35 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_15.conf
00:08:19.463 10:39:35 -- ../common.sh@72 -- # (( i++ ))
00:08:19.463 10:39:35 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:19.463 10:39:35 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1
00:08:19.463 10:39:35 -- nvmf/run.sh@23 -- # local fuzzer_type=16
00:08:19.463 10:39:35 -- nvmf/run.sh@24 -- # local timen=1
00:08:19.463 10:39:35 -- nvmf/run.sh@25 -- # local core=0x1
00:08:19.463 10:39:35 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16
00:08:19.463 10:39:35 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf
00:08:19.463 10:39:35 -- nvmf/run.sh@29 -- # printf %02d 16
00:08:19.463 10:39:35 -- nvmf/run.sh@29 -- # port=4416
00:08:19.463 10:39:35 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16
00:08:19.463 10:39:35 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416'
00:08:19.463 10:39:35 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:19.463 10:39:35 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 -r /var/tmp/spdk16.sock
00:08:19.463 [2024-07-13 10:39:35.654969] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... [2024-07-13 10:39:35.655022] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1996593 ]
00:08:19.463 EAL: No free 2048 kB hugepages reported on node 1
00:08:19.463 [2024-07-13 10:39:35.827208] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:19.463 [2024-07-13 10:39:35.846937] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:19.463 [2024-07-13 10:39:35.847079] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:19.722 [2024-07-13 10:39:35.898644] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:19.722 [2024-07-13 10:39:35.914910] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 ***
00:08:19.722 INFO: Running with entropic power schedule (0xFF, 100).
00:08:19.722 INFO: Seed: 3583523339
00:08:19.722 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9),
00:08:19.722 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280),
00:08:19.722 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16
00:08:19.722 INFO: A corpus is not provided, starting from an empty corpus
00:08:19.722 #2 INITED exec/s: 0 rss: 60Mb
00:08:19.722 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:19.722 This may also happen if the target rejected all inputs we tried so far 00:08:19.722 [2024-07-13 10:39:35.960199] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.722 [2024-07-13 10:39:35.960230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.722 [2024-07-13 10:39:35.960273] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.722 [2024-07-13 10:39:35.960287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.722 [2024-07-13 10:39:35.960341] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.722 [2024-07-13 10:39:35.960357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.981 NEW_FUNC[1/671]: 0x4b5130 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:08:19.981 NEW_FUNC[2/671]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:19.981 #18 NEW cov: 11594 ft: 11595 corp: 2/78b lim: 105 exec/s: 0 rss: 68Mb L: 77/77 MS: 1 InsertRepeatedBytes- 00:08:19.981 [2024-07-13 10:39:36.270898] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.981 [2024-07-13 10:39:36.270932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.981 [2024-07-13 10:39:36.270993] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.981 [2024-07-13 10:39:36.271009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.981 [2024-07-13 10:39:36.271064] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.981 [2024-07-13 10:39:36.271078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.981 #19 NEW cov: 11707 ft: 12038 corp: 3/155b lim: 105 exec/s: 0 rss: 68Mb L: 77/77 MS: 1 ChangeByte- 00:08:19.981 [2024-07-13 10:39:36.311108] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.981 [2024-07-13 10:39:36.311137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.981 [2024-07-13 10:39:36.311173] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.981 [2024-07-13 10:39:36.311189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.981 [2024-07-13 10:39:36.311243] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.981 [2024-07-13 10:39:36.311258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.981 [2024-07-13 10:39:36.311314] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.981 [2024-07-13 10:39:36.311329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.981 #20 NEW cov: 11713 ft: 12836 corp: 4/245b lim: 105 exec/s: 0 rss: 68Mb L: 90/90 MS: 1 InsertRepeatedBytes- 00:08:19.981 [2024-07-13 10:39:36.351076] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.981 [2024-07-13 10:39:36.351103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.981 [2024-07-13 10:39:36.351156] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.981 [2024-07-13 10:39:36.351172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.981 [2024-07-13 10:39:36.351226] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:19.981 [2024-07-13 10:39:36.351242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.241 #21 NEW cov: 11798 ft: 13056 corp: 5/322b lim: 105 exec/s: 0 rss: 68Mb L: 77/90 MS: 1 CopyPart- 00:08:20.241 [2024-07-13 10:39:36.391317] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.241 [2024-07-13 10:39:36.391343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.241 [2024-07-13 10:39:36.391391] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.241 [2024-07-13 10:39:36.391406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.241 [2024-07-13 10:39:36.391461] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.241 [2024-07-13 10:39:36.391477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.241 [2024-07-13 10:39:36.391528] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.241 [2024-07-13 10:39:36.391541] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.241 #22 NEW cov: 11798 ft: 13128 corp: 6/412b lim: 105 exec/s: 0 rss: 68Mb L: 90/90 MS: 1 ChangeBinInt- 00:08:20.241 [2024-07-13 10:39:36.431301] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.241 [2024-07-13 10:39:36.431329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.241 [2024-07-13 10:39:36.431367] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.241 [2024-07-13 10:39:36.431383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.241 [2024-07-13 10:39:36.431434] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.241 [2024-07-13 10:39:36.431453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.241 #23 NEW cov: 11798 ft: 13180 corp: 7/490b lim: 105 exec/s: 0 rss: 69Mb L: 78/90 MS: 1 InsertByte- 00:08:20.241 [2024-07-13 10:39:36.471320] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.241 [2024-07-13 10:39:36.471350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.241 [2024-07-13 10:39:36.471400] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.241 [2024-07-13 10:39:36.471416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.241 [2024-07-13 10:39:36.471465] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.241 [2024-07-13 10:39:36.471480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.241 #24 NEW cov: 11798 ft: 13292 corp: 8/568b lim: 105 exec/s: 0 rss: 69Mb L: 78/90 MS: 1 ShuffleBytes- 00:08:20.241 [2024-07-13 10:39:36.511358] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.241 [2024-07-13 10:39:36.511386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.241 [2024-07-13 10:39:36.511429] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.241 [2024-07-13 10:39:36.511447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.241 #25 NEW cov: 11798 ft: 13713 corp: 9/613b lim: 105 exec/s: 0 rss: 69Mb L: 45/90 MS: 1 CrossOver- 
00:08:20.241 [2024-07-13 10:39:36.551413] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.241 [2024-07-13 10:39:36.551447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.241 #26 NEW cov: 11798 ft: 14206 corp: 10/640b lim: 105 exec/s: 0 rss: 69Mb L: 27/90 MS: 1 EraseBytes- 00:08:20.241 [2024-07-13 10:39:36.591637] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.241 [2024-07-13 10:39:36.591664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.241 [2024-07-13 10:39:36.591716] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.241 [2024-07-13 10:39:36.591731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.241 #27 NEW cov: 11798 ft: 14274 corp: 11/685b lim: 105 exec/s: 0 rss: 69Mb L: 45/90 MS: 1 CrossOver- 00:08:20.501 [2024-07-13 10:39:36.631666] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.501 [2024-07-13 10:39:36.631694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.501 #28 NEW cov: 11798 ft: 14330 corp: 12/712b lim: 105 exec/s: 0 rss: 69Mb L: 27/90 MS: 1 CopyPart- 00:08:20.501 [2024-07-13 10:39:36.672154] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:58854 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.501 [2024-07-13 10:39:36.672182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.501 [2024-07-13 10:39:36.672221] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.501 [2024-07-13 10:39:36.672237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.501 [2024-07-13 10:39:36.672291] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.501 [2024-07-13 10:39:36.672307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.501 [2024-07-13 10:39:36.672361] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.501 [2024-07-13 10:39:36.672375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.501 #29 NEW cov: 11798 ft: 14358 corp: 13/802b lim: 105 exec/s: 0 rss: 69Mb L: 90/90 MS: 1 ChangeBinInt- 00:08:20.501 [2024-07-13 10:39:36.712213] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 
nsid:0 lba:1880844493789993498 len:58854 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.501 [2024-07-13 10:39:36.712240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.501 [2024-07-13 10:39:36.712285] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.501 [2024-07-13 10:39:36.712300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.501 [2024-07-13 10:39:36.712355] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.501 [2024-07-13 10:39:36.712369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.501 [2024-07-13 10:39:36.712422] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.501 [2024-07-13 10:39:36.712436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.501 #30 NEW cov: 11798 ft: 14383 corp: 14/893b lim: 105 exec/s: 0 rss: 69Mb L: 91/91 MS: 1 InsertByte- 00:08:20.501 [2024-07-13 10:39:36.752231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.501 [2024-07-13 10:39:36.752258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.501 [2024-07-13 10:39:36.752295] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.501 [2024-07-13 10:39:36.752311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.501 [2024-07-13 10:39:36.752365] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.501 [2024-07-13 10:39:36.752380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.501 #31 NEW cov: 11798 ft: 14424 corp: 15/970b lim: 105 exec/s: 0 rss: 69Mb L: 77/91 MS: 1 ChangeByte- 00:08:20.501 [2024-07-13 10:39:36.792480] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:58854 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.501 [2024-07-13 10:39:36.792507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.501 [2024-07-13 10:39:36.792547] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.501 [2024-07-13 10:39:36.792562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.501 [2024-07-13 10:39:36.792615] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.501 [2024-07-13 10:39:36.792631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.501 [2024-07-13 10:39:36.792683] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.501 [2024-07-13 10:39:36.792697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.501 #32 NEW cov: 11798 ft: 14433 corp: 16/1060b lim: 105 exec/s: 0 rss: 69Mb L: 90/91 MS: 1 ChangeBinInt- 00:08:20.501 [2024-07-13 10:39:36.832324] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.502 [2024-07-13 10:39:36.832351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.502 [2024-07-13 10:39:36.832388] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.502 [2024-07-13 10:39:36.832401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.502 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:20.502 #33 NEW cov: 11821 ft: 14453 corp: 17/1106b lim: 105 exec/s: 0 rss: 69Mb L: 46/91 MS: 1 CrossOver- 00:08:20.502 [2024-07-13 10:39:36.872577] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.502 [2024-07-13 10:39:36.872605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.502 [2024-07-13 10:39:36.872644] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.502 [2024-07-13 10:39:36.872659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.502 [2024-07-13 10:39:36.872711] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.502 [2024-07-13 10:39:36.872726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.762 #34 NEW cov: 11821 ft: 14474 corp: 18/1184b lim: 105 exec/s: 0 rss: 70Mb L: 78/91 MS: 1 ChangeBinInt- 00:08:20.762 [2024-07-13 10:39:36.912685] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.762 [2024-07-13 10:39:36.912713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.762 [2024-07-13 10:39:36.912767] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:20.762 [2024-07-13 10:39:36.912783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.762 [2024-07-13 10:39:36.912835] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.762 [2024-07-13 10:39:36.912854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.762 #35 NEW cov: 11821 ft: 14494 corp: 19/1263b lim: 105 exec/s: 0 rss: 70Mb L: 79/91 MS: 1 InsertByte- 00:08:20.762 [2024-07-13 10:39:36.952802] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.762 [2024-07-13 10:39:36.952829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.762 [2024-07-13 10:39:36.952865] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.762 [2024-07-13 10:39:36.952880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.762 [2024-07-13 10:39:36.952933] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.762 [2024-07-13 10:39:36.952948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.762 #36 NEW cov: 11821 ft: 14501 corp: 20/1341b lim: 105 exec/s: 36 rss: 70Mb L: 78/91 MS: 1 ShuffleBytes- 00:08:20.762 [2024-07-13 10:39:36.992805] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.762 [2024-07-13 10:39:36.992833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.762 [2024-07-13 10:39:36.992874] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.762 [2024-07-13 10:39:36.992889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.762 #37 NEW cov: 11821 ft: 14592 corp: 21/1387b lim: 105 exec/s: 37 rss: 70Mb L: 46/91 MS: 1 InsertByte- 00:08:20.762 [2024-07-13 10:39:37.033059] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.762 [2024-07-13 10:39:37.033087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.762 [2024-07-13 10:39:37.033123] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:9243 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.762 [2024-07-13 10:39:37.033139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.762 [2024-07-13 
10:39:37.033191] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.762 [2024-07-13 10:39:37.033206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.762 #38 NEW cov: 11821 ft: 14613 corp: 22/1466b lim: 105 exec/s: 38 rss: 70Mb L: 79/91 MS: 1 InsertByte- 00:08:20.762 [2024-07-13 10:39:37.063035] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.762 [2024-07-13 10:39:37.063061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.762 [2024-07-13 10:39:37.063113] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.762 [2024-07-13 10:39:37.063129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.762 #39 NEW cov: 11821 ft: 14669 corp: 23/1519b lim: 105 exec/s: 39 rss: 70Mb L: 53/91 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\003"- 00:08:20.762 [2024-07-13 10:39:37.103353] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.762 [2024-07-13 10:39:37.103380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.762 [2024-07-13 10:39:37.103418] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844494075206170 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.762 [2024-07-13 10:39:37.103432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.762 [2024-07-13 10:39:37.103491] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.762 [2024-07-13 10:39:37.103506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.762 [2024-07-13 10:39:37.103576] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.762 [2024-07-13 10:39:37.103591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.762 #40 NEW cov: 11821 ft: 14698 corp: 24/1609b lim: 105 exec/s: 40 rss: 70Mb L: 90/91 MS: 1 ChangeByte- 00:08:20.762 [2024-07-13 10:39:37.143242] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.762 [2024-07-13 10:39:37.143270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.762 [2024-07-13 10:39:37.143307] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:20.762 [2024-07-13 
10:39:37.143320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.022 #41 NEW cov: 11821 ft: 14707 corp: 25/1655b lim: 105 exec/s: 41 rss: 70Mb L: 46/91 MS: 1 ChangeByte- 00:08:21.022 [2024-07-13 10:39:37.183217] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.022 [2024-07-13 10:39:37.183246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.022 #42 NEW cov: 11821 ft: 14748 corp: 26/1695b lim: 105 exec/s: 42 rss: 70Mb L: 40/91 MS: 1 CopyPart- 00:08:21.022 [2024-07-13 10:39:37.223707] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.022 [2024-07-13 10:39:37.223734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.022 [2024-07-13 10:39:37.223779] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.022 [2024-07-13 10:39:37.223795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.022 [2024-07-13 10:39:37.223849] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1932354414532434458 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.022 [2024-07-13 10:39:37.223863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.022 [2024-07-13 10:39:37.223916] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.022 [2024-07-13 10:39:37.223934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.022 #43 NEW cov: 11821 ft: 14754 corp: 27/1795b lim: 105 exec/s: 43 rss: 70Mb L: 100/100 MS: 1 CrossOver- 00:08:21.022 [2024-07-13 10:39:37.263596] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:16565675503761678053 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.022 [2024-07-13 10:39:37.263622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.022 [2024-07-13 10:39:37.263667] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.022 [2024-07-13 10:39:37.263682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.022 #44 NEW cov: 11821 ft: 14771 corp: 28/1840b lim: 105 exec/s: 44 rss: 70Mb L: 45/100 MS: 1 ChangeBinInt- 00:08:21.022 [2024-07-13 10:39:37.303897] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:14650 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.022 [2024-07-13 10:39:37.303924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.022 [2024-07-13 10:39:37.303970] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.022 [2024-07-13 10:39:37.303984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.022 [2024-07-13 10:39:37.304035] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.022 [2024-07-13 10:39:37.304050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.022 [2024-07-13 10:39:37.304103] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.022 [2024-07-13 10:39:37.304117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.022 #45 NEW cov: 11821 ft: 14820 corp: 29/1941b lim: 105 exec/s: 45 rss: 70Mb L: 101/101 MS: 1 InsertRepeatedBytes- 00:08:21.022 [2024-07-13 10:39:37.343855] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:16565675503761678053 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.022 [2024-07-13 10:39:37.343882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.022 [2024-07-13 10:39:37.343921] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.022 [2024-07-13 10:39:37.343935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.022 #46 NEW cov: 11821 ft: 14854 corp: 30/1986b lim: 105 exec/s: 46 rss: 70Mb L: 45/101 MS: 1 ChangeByte- 00:08:21.022 [2024-07-13 10:39:37.383937] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.022 [2024-07-13 10:39:37.383963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.022 [2024-07-13 10:39:37.384013] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789998362 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.022 [2024-07-13 10:39:37.384029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.022 #47 NEW cov: 11821 ft: 14868 corp: 31/2031b lim: 105 exec/s: 47 rss: 70Mb L: 45/101 MS: 1 ChangeBinInt- 00:08:21.282 [2024-07-13 10:39:37.424230] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.282 [2024-07-13 10:39:37.424259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.282 [2024-07-13 10:39:37.424295] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.282 [2024-07-13 10:39:37.424310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.282 [2024-07-13 10:39:37.424362] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.282 [2024-07-13 10:39:37.424377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.282 #48 NEW cov: 11821 ft: 14876 corp: 32/2109b lim: 105 exec/s: 48 rss: 70Mb L: 78/101 MS: 1 ChangeByte- 00:08:21.282 [2024-07-13 10:39:37.454154] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:16565675503761678053 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.282 [2024-07-13 10:39:37.454181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.282 [2024-07-13 10:39:37.454231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.282 [2024-07-13 10:39:37.454246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.282 #49 NEW cov: 11821 ft: 14953 corp: 33/2154b lim: 105 exec/s: 49 rss: 70Mb L: 45/101 MS: 1 ChangeBit- 00:08:21.282 [2024-07-13 10:39:37.494477] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.282 [2024-07-13 10:39:37.494504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.282 [2024-07-13 10:39:37.494557] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5627839383324461663 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.282 [2024-07-13 10:39:37.494572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.282 [2024-07-13 10:39:37.494627] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.282 [2024-07-13 10:39:37.494642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.282 [2024-07-13 10:39:37.494694] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.282 [2024-07-13 10:39:37.494708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.283 #50 NEW cov: 11821 ft: 14961 corp: 34/2252b lim: 105 exec/s: 50 rss: 70Mb L: 98/101 MS: 1 CMP- DE: "\000\000\000\000\002\012_N"- 00:08:21.283 [2024-07-13 10:39:37.534455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.283 [2024-07-13 10:39:37.534481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 
p:0 m:0 dnr:1 00:08:21.283 [2024-07-13 10:39:37.534518] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.283 [2024-07-13 10:39:37.534536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.283 [2024-07-13 10:39:37.534589] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.283 [2024-07-13 10:39:37.534604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.283 #51 NEW cov: 11821 ft: 14979 corp: 35/2330b lim: 105 exec/s: 51 rss: 70Mb L: 78/101 MS: 1 ChangeBit- 00:08:21.283 [2024-07-13 10:39:37.574578] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.283 [2024-07-13 10:39:37.574605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.283 [2024-07-13 10:39:37.574643] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:9243 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.283 [2024-07-13 10:39:37.574658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.283 [2024-07-13 10:39:37.574709] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.283 [2024-07-13 10:39:37.574724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.283 #52 NEW cov: 11821 ft: 15001 corp: 36/2409b lim: 105 exec/s: 52 rss: 70Mb L: 79/101 MS: 1 CopyPart- 00:08:21.283 [2024-07-13 10:39:37.614594] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.283 [2024-07-13 10:39:37.614621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.283 [2024-07-13 10:39:37.614674] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.283 [2024-07-13 10:39:37.614688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.283 #53 NEW cov: 11821 ft: 15014 corp: 37/2463b lim: 105 exec/s: 53 rss: 70Mb L: 54/101 MS: 1 InsertByte- 00:08:21.283 [2024-07-13 10:39:37.654972] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.283 [2024-07-13 10:39:37.654999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.283 [2024-07-13 10:39:37.655046] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.283 
[2024-07-13 10:39:37.655061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.283 [2024-07-13 10:39:37.655110] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1878311218999597594 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.283 [2024-07-13 10:39:37.655124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.283 [2024-07-13 10:39:37.655175] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.283 [2024-07-13 10:39:37.655190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.543 #54 NEW cov: 11821 ft: 15026 corp: 38/2553b lim: 105 exec/s: 54 rss: 70Mb L: 90/101 MS: 1 ChangeByte- 00:08:21.543 [2024-07-13 10:39:37.695020] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.543 [2024-07-13 10:39:37.695047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.543 [2024-07-13 10:39:37.695094] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789998362 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.543 [2024-07-13 10:39:37.695124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.543 [2024-07-13 10:39:37.695176] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1880844493789993687 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.543 [2024-07-13 10:39:37.695191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.543 [2024-07-13 10:39:37.695245] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.543 [2024-07-13 10:39:37.695260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.543 #55 NEW cov: 11821 ft: 15031 corp: 39/2638b lim: 105 exec/s: 55 rss: 70Mb L: 85/101 MS: 1 CrossOver- 00:08:21.543 [2024-07-13 10:39:37.734854] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.543 [2024-07-13 10:39:37.734881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.543 #56 NEW cov: 11821 ft: 15037 corp: 40/2678b lim: 105 exec/s: 56 rss: 70Mb L: 40/101 MS: 1 CrossOver- 00:08:21.543 [2024-07-13 10:39:37.775283] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.543 [2024-07-13 10:39:37.775310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.543 [2024-07-13 10:39:37.775356] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5627839383324461663 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.543 [2024-07-13 10:39:37.775371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.543 [2024-07-13 10:39:37.775423] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.543 [2024-07-13 10:39:37.775439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.543 [2024-07-13 10:39:37.775498] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.543 [2024-07-13 10:39:37.775511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.543 #57 NEW cov: 11821 ft: 15053 corp: 41/2776b lim: 105 exec/s: 57 rss: 70Mb L: 98/101 MS: 1 ShuffleBytes- 00:08:21.543 [2024-07-13 10:39:37.815315] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.543 [2024-07-13 10:39:37.815342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.543 [2024-07-13 10:39:37.815376] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:55770 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.543 [2024-07-13 10:39:37.815392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.543 [2024-07-13 10:39:37.815451] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:15697817505862638041 len:55770 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.543 [2024-07-13 10:39:37.815465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.543 #58 NEW cov: 11821 ft: 15065 corp: 42/2854b lim: 105 exec/s: 58 rss: 70Mb L: 78/101 MS: 1 InsertRepeatedBytes- 00:08:21.543 [2024-07-13 10:39:37.855440] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.543 [2024-07-13 10:39:37.855471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.543 [2024-07-13 10:39:37.855519] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.543 [2024-07-13 10:39:37.855534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.543 [2024-07-13 10:39:37.855588] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:14178673876263027908 len:50373 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.543 [2024-07-13 10:39:37.855603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:21.543 #59 NEW cov: 11821 ft: 15103 corp: 43/2924b lim: 105 exec/s: 59 rss: 70Mb L: 70/101 MS: 1 InsertRepeatedBytes-
00:08:21.543 [2024-07-13 10:39:37.895657] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1901082104810838554 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:21.543 [2024-07-13 10:39:37.895683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:21.543 [2024-07-13 10:39:37.895747] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:5627839383324461663 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:21.543 [2024-07-13 10:39:37.895763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:21.543 [2024-07-13 10:39:37.895815] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:21.543 [2024-07-13 10:39:37.895829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:21.543 [2024-07-13 10:39:37.895885] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:21.543 [2024-07-13 10:39:37.895899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:21.543 #60 NEW cov: 11821 ft: 15108 corp: 44/3022b lim: 105 exec/s: 60 rss: 70Mb L: 98/101 MS: 1 ChangeBinInt-
00:08:21.802 [2024-07-13 10:39:37.935560] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1880844493789993498 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:21.802 [2024-07-13 10:39:37.935587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:21.802 [2024-07-13 10:39:37.935639] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1880844493789998362 len:6683 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:21.802 [2024-07-13 10:39:37.935654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:21.802 #61 NEW cov: 11821 ft: 15111 corp: 45/3072b lim: 105 exec/s: 30 rss: 70Mb L: 50/101 MS: 1 CrossOver-
00:08:21.802 #61 DONE cov: 11821 ft: 15111 corp: 45/3072b lim: 105 exec/s: 30 rss: 70Mb
00:08:21.802 ###### Recommended dictionary. ######
00:08:21.802 "\377\377\377\377\377\377\377\003" # Uses: 0
00:08:21.802 "\000\000\000\000\002\012_N" # Uses: 0
00:08:21.802 ###### End of recommended dictionary. ######
00:08:21.802 Done 61 runs in 2 second(s)
00:08:21.802 10:39:38 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_16.conf
00:08:21.802 10:39:38 -- ../common.sh@72 -- # (( i++ ))
00:08:21.802 10:39:38 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:21.802 10:39:38 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1
00:08:21.802 10:39:38 -- nvmf/run.sh@23 -- # local fuzzer_type=17
00:08:21.802 10:39:38 -- nvmf/run.sh@24 -- # local timen=1
00:08:21.802 10:39:38 -- nvmf/run.sh@25 -- # local core=0x1
00:08:21.802 10:39:38 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17
00:08:21.802 10:39:38 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf
00:08:21.802 10:39:38 -- nvmf/run.sh@29 -- # printf %02d 17
00:08:21.802 10:39:38 -- nvmf/run.sh@29 -- # port=4417
00:08:21.802 10:39:38 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17
00:08:21.802 10:39:38 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417'
00:08:21.802 10:39:38 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:21.802 10:39:38 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 -r /var/tmp/spdk17.sock
00:08:21.802 [2024-07-13 10:39:38.112036] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... [2024-07-13 10:39:38.112105] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1996906 ] EAL: No free 2048 kB hugepages reported on node 1
00:08:22.061 [2024-07-13 10:39:38.292509] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:22.061 [2024-07-13 10:39:38.312522] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:22.061 [2024-07-13 10:39:38.312661] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:22.061 [2024-07-13 10:39:38.364079] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:22.061 [2024-07-13 10:39:38.380389] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 ***
00:08:22.061 INFO: Running with entropic power schedule (0xFF, 100).
00:08:22.061 INFO: Seed: 1752513415
00:08:22.061 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9),
00:08:22.061 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280),
00:08:22.061 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17
00:08:22.061 INFO: A corpus is not provided, starting from an empty corpus
00:08:22.061 #2 INITED exec/s: 0 rss: 60Mb
00:08:22.061 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:22.061 This may also happen if the target rejected all inputs we tried so far 00:08:22.319 [2024-07-13 10:39:38.448659] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.319 [2024-07-13 10:39:38.448695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.319 [2024-07-13 10:39:38.448815] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.319 [2024-07-13 10:39:38.448837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.578 NEW_FUNC[1/672]: 0x4b8420 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:08:22.578 NEW_FUNC[2/672]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:22.578 #4 NEW cov: 11615 ft: 11616 corp: 2/68b lim: 120 exec/s: 0 rss: 68Mb L: 67/67 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:22.578 [2024-07-13 10:39:38.779512] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.578 [2024-07-13 10:39:38.779547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.578 [2024-07-13 10:39:38.779663] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.578 [2024-07-13 10:39:38.779684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.578 #5 NEW cov: 11728 ft: 12223 corp: 3/135b lim: 120 exec/s: 0 rss: 68Mb L: 67/67 MS: 1 CrossOver- 00:08:22.579 [2024-07-13 10:39:38.829612] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.579 [2024-07-13 10:39:38.829647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.579 [2024-07-13 10:39:38.829761] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.579 [2024-07-13 10:39:38.829782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.579 #6 NEW cov: 11734 ft: 12493 corp: 4/200b lim: 120 exec/s: 0 rss: 68Mb L: 65/67 MS: 1 InsertRepeatedBytes- 00:08:22.579 [2024-07-13 10:39:38.869803] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.579 [2024-07-13 10:39:38.869833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.579 [2024-07-13 10:39:38.869957] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.579 [2024-07-13 10:39:38.869980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.579 #7 NEW cov: 11819 ft: 12776 corp: 5/265b lim: 120 exec/s: 0 rss: 68Mb L: 65/67 MS: 1 CMP- DE: "\200\000\000\000"- 00:08:22.579 [2024-07-13 10:39:38.919869] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.579 [2024-07-13 10:39:38.919902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.579 [2024-07-13 10:39:38.920027] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744065119617024 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.579 [2024-07-13 10:39:38.920051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.579 #8 NEW cov: 11819 ft: 12829 corp: 6/332b lim: 120 exec/s: 0 rss: 68Mb L: 67/67 MS: 1 ChangeBinInt- 00:08:22.579 [2024-07-13 10:39:38.960310] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.579 [2024-07-13 10:39:38.960344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.579 [2024-07-13 10:39:38.960457] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.579 [2024-07-13 10:39:38.960481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.579 [2024-07-13 10:39:38.960594] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446741874686296319 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.579 [2024-07-13 10:39:38.960619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.838 #9 NEW cov: 11819 ft: 13266 corp: 7/422b lim: 120 exec/s: 0 rss: 69Mb L: 90/90 MS: 1 CopyPart- 00:08:22.838 [2024-07-13 10:39:39.010260] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.838 [2024-07-13 10:39:39.010286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.838 [2024-07-13 10:39:39.010411] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.838 [2024-07-13 10:39:39.010430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.838 #10 NEW cov: 11819 ft: 13343 corp: 8/489b lim: 120 exec/s: 0 rss: 69Mb L: 67/90 MS: 1 ChangeByte- 00:08:22.838 [2024-07-13 10:39:39.050415] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.839 [2024-07-13 10:39:39.050447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.839 [2024-07-13 10:39:39.050572] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:2049 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.839 
[2024-07-13 10:39:39.050596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.839 #11 NEW cov: 11819 ft: 13388 corp: 9/554b lim: 120 exec/s: 0 rss: 69Mb L: 65/90 MS: 1 ChangeBit- 00:08:22.839 [2024-07-13 10:39:39.100467] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.839 [2024-07-13 10:39:39.100499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.839 [2024-07-13 10:39:39.100630] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:2049 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.839 [2024-07-13 10:39:39.100647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.839 #12 NEW cov: 11819 ft: 13405 corp: 10/623b lim: 120 exec/s: 0 rss: 69Mb L: 69/90 MS: 1 PersAutoDict- DE: "\200\000\000\000"- 00:08:22.839 [2024-07-13 10:39:39.140939] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.839 [2024-07-13 10:39:39.140970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.839 [2024-07-13 10:39:39.141099] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:2049 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.839 [2024-07-13 10:39:39.141123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.839 [2024-07-13 10:39:39.141241] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.839 [2024-07-13 10:39:39.141263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.839 #13 NEW cov: 11819 ft: 13520 corp: 11/705b lim: 120 exec/s: 0 rss: 69Mb L: 82/90 MS: 1 CrossOver- 00:08:22.839 [2024-07-13 10:39:39.181245] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.839 [2024-07-13 10:39:39.181277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.839 [2024-07-13 10:39:39.181379] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:138 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.839 [2024-07-13 10:39:39.181407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.839 [2024-07-13 10:39:39.181524] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:9910603678816504201 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.839 [2024-07-13 10:39:39.181545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.839 [2024-07-13 10:39:39.181665] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:9910452457312979337 len:1 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:22.839 [2024-07-13 10:39:39.181682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.839 #14 NEW cov: 11819 ft: 13895 corp: 12/815b lim: 120 exec/s: 0 rss: 69Mb L: 110/110 MS: 1 InsertRepeatedBytes- 00:08:22.839 [2024-07-13 10:39:39.220887] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.839 [2024-07-13 10:39:39.220916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.839 [2024-07-13 10:39:39.221039] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:2049 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.839 [2024-07-13 10:39:39.221062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.099 #15 NEW cov: 11819 ft: 13902 corp: 13/880b lim: 120 exec/s: 0 rss: 69Mb L: 65/110 MS: 1 ChangeBit- 00:08:23.099 [2024-07-13 10:39:39.261578] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.099 [2024-07-13 10:39:39.261606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.099 [2024-07-13 10:39:39.261716] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:138 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.099 [2024-07-13 10:39:39.261739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.099 [2024-07-13 10:39:39.261856] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:9910603678816504201 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.099 [2024-07-13 10:39:39.261875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.099 [2024-07-13 10:39:39.261994] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:9910452457312979337 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.099 [2024-07-13 10:39:39.262013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.099 #16 NEW cov: 11819 ft: 13968 corp: 14/990b lim: 120 exec/s: 0 rss: 69Mb L: 110/110 MS: 1 ChangeBit- 00:08:23.099 [2024-07-13 10:39:39.311608] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.099 [2024-07-13 10:39:39.311641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.099 [2024-07-13 10:39:39.311729] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.099 [2024-07-13 10:39:39.311753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.099 [2024-07-13 10:39:39.311884] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:38713293312884736 
len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.099 [2024-07-13 10:39:39.311905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.099 [2024-07-13 10:39:39.312027] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:9910603678816504201 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.099 [2024-07-13 10:39:39.312049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.099 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:23.099 #17 NEW cov: 11842 ft: 14022 corp: 15/1103b lim: 120 exec/s: 0 rss: 70Mb L: 113/113 MS: 1 CrossOver- 00:08:23.099 [2024-07-13 10:39:39.361572] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.099 [2024-07-13 10:39:39.361610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.099 [2024-07-13 10:39:39.361721] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:2049 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.099 [2024-07-13 10:39:39.361743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.099 [2024-07-13 10:39:39.361862] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.099 [2024-07-13 10:39:39.361881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.099 #18 NEW cov: 11842 ft: 14062 corp: 16/1185b lim: 120 exec/s: 0 rss: 70Mb L: 82/113 MS: 1 ShuffleBytes- 00:08:23.099 [2024-07-13 10:39:39.401498] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.099 [2024-07-13 10:39:39.401530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.099 [2024-07-13 10:39:39.401660] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744065119617024 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.099 [2024-07-13 10:39:39.401684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.099 #19 NEW cov: 11842 ft: 14157 corp: 17/1252b lim: 120 exec/s: 19 rss: 70Mb L: 67/113 MS: 1 CrossOver- 00:08:23.099 [2024-07-13 10:39:39.441366] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.099 [2024-07-13 10:39:39.441399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.099 [2024-07-13 10:39:39.441542] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.099 [2024-07-13 10:39:39.441562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 
dnr:1 00:08:23.099 #20 NEW cov: 11842 ft: 14216 corp: 18/1302b lim: 120 exec/s: 20 rss: 70Mb L: 50/113 MS: 1 EraseBytes- 00:08:23.099 [2024-07-13 10:39:39.481865] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.099 [2024-07-13 10:39:39.481899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.099 [2024-07-13 10:39:39.482015] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.099 [2024-07-13 10:39:39.482040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.099 [2024-07-13 10:39:39.482163] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.099 [2024-07-13 10:39:39.482184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.358 #22 NEW cov: 11842 ft: 14231 corp: 19/1396b lim: 120 exec/s: 22 rss: 70Mb L: 94/113 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:23.358 [2024-07-13 10:39:39.521783] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.358 [2024-07-13 10:39:39.521818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.358 [2024-07-13 10:39:39.521944] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:256 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.358 [2024-07-13 10:39:39.521967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.358 #23 NEW cov: 11842 ft: 14269 corp: 20/1463b lim: 120 exec/s: 23 rss: 70Mb L: 67/113 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\004"- 00:08:23.358 [2024-07-13 10:39:39.561934] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.358 [2024-07-13 10:39:39.561969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.358 [2024-07-13 10:39:39.562098] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.358 [2024-07-13 10:39:39.562122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.358 #24 NEW cov: 11842 ft: 14299 corp: 21/1528b lim: 120 exec/s: 24 rss: 70Mb L: 65/113 MS: 1 ShuffleBytes- 00:08:23.358 [2024-07-13 10:39:39.602317] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.358 [2024-07-13 10:39:39.602353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.358 [2024-07-13 10:39:39.602476] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744065119617024 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:23.358 [2024-07-13 10:39:39.602498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.358 [2024-07-13 10:39:39.602612] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:2561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.358 [2024-07-13 10:39:39.602633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.358 #25 NEW cov: 11842 ft: 14312 corp: 22/1611b lim: 120 exec/s: 25 rss: 70Mb L: 83/113 MS: 1 CopyPart- 00:08:23.358 [2024-07-13 10:39:39.642127] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.358 [2024-07-13 10:39:39.642162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.358 [2024-07-13 10:39:39.642277] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.358 [2024-07-13 10:39:39.642300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.358 #26 NEW cov: 11842 ft: 14353 corp: 23/1676b lim: 120 exec/s: 26 rss: 70Mb L: 65/113 MS: 1 ChangeBinInt- 00:08:23.358 [2024-07-13 10:39:39.692719] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.358 [2024-07-13 10:39:39.692754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.358 [2024-07-13 10:39:39.692841] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.358 [2024-07-13 10:39:39.692864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.358 [2024-07-13 10:39:39.692976] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:38713293312884736 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.358 [2024-07-13 10:39:39.692997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.358 [2024-07-13 10:39:39.693108] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:9910603678816504201 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.359 [2024-07-13 10:39:39.693130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.359 [2024-07-13 10:39:39.693240] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:0 lba:150633093005312 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.359 [2024-07-13 10:39:39.693266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:23.359 #27 NEW cov: 11842 ft: 14398 corp: 24/1796b lim: 120 exec/s: 27 rss: 70Mb L: 120/120 MS: 1 CrossOver- 00:08:23.359 [2024-07-13 10:39:39.742425] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:23.359 [2024-07-13 10:39:39.742463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.359 [2024-07-13 10:39:39.742570] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:2049 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.359 [2024-07-13 10:39:39.742593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.617 #28 NEW cov: 11842 ft: 14427 corp: 25/1865b lim: 120 exec/s: 28 rss: 70Mb L: 69/120 MS: 1 ShuffleBytes- 00:08:23.618 [2024-07-13 10:39:39.782892] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167814144 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.618 [2024-07-13 10:39:39.782925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.618 [2024-07-13 10:39:39.783013] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:72057589742960640 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.618 [2024-07-13 10:39:39.783033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.618 [2024-07-13 10:39:39.783144] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:11 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.618 [2024-07-13 10:39:39.783161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.618 #29 NEW cov: 11842 ft: 14453 corp: 26/1949b lim: 120 exec/s: 29 rss: 70Mb L: 84/120 MS: 1 InsertByte- 00:08:23.618 [2024-07-13 10:39:39.822648] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.618 [2024-07-13 10:39:39.822678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.618 [2024-07-13 10:39:39.822779] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:2049 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.618 [2024-07-13 10:39:39.822799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.618 #30 NEW cov: 11842 ft: 14485 corp: 27/2018b lim: 120 exec/s: 30 rss: 70Mb L: 69/120 MS: 1 ShuffleBytes- 00:08:23.618 [2024-07-13 10:39:39.863112] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.618 [2024-07-13 10:39:39.863150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.618 [2024-07-13 10:39:39.863267] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:72057589742960640 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.618 [2024-07-13 10:39:39.863290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.618 [2024-07-13 10:39:39.863406] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:11 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.618 [2024-07-13 10:39:39.863430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.618 #31 NEW cov: 11842 ft: 14499 corp: 28/2102b lim: 120 exec/s: 31 rss: 70Mb L: 84/120 MS: 1 InsertByte- 00:08:23.618 [2024-07-13 10:39:39.903480] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.618 [2024-07-13 10:39:39.903511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.618 [2024-07-13 10:39:39.903592] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.618 [2024-07-13 10:39:39.903612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.618 [2024-07-13 10:39:39.903721] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.618 [2024-07-13 10:39:39.903744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.618 [2024-07-13 10:39:39.903864] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.618 [2024-07-13 10:39:39.903884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.618 #32 NEW cov: 11842 ft: 14515 corp: 29/2203b lim: 120 exec/s: 32 rss: 70Mb L: 101/120 MS: 1 CopyPart- 00:08:23.618 [2024-07-13 10:39:39.943092] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:16641 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.618 [2024-07-13 10:39:39.943120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.618 [2024-07-13 10:39:39.943223] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.618 [2024-07-13 10:39:39.943246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.618 #33 NEW cov: 11842 ft: 14527 corp: 30/2268b lim: 120 exec/s: 33 rss: 70Mb L: 65/120 MS: 1 ChangeBinInt- 00:08:23.618 [2024-07-13 10:39:39.983756] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.618 [2024-07-13 10:39:39.983789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.618 [2024-07-13 10:39:39.983877] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:2049 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.618 [2024-07-13 10:39:39.983899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.618 [2024-07-13 10:39:39.984012] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:23.618 [2024-07-13 10:39:39.984037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.618 [2024-07-13 10:39:39.984146] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:9910603678816504217 len:35210 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.618 [2024-07-13 10:39:39.984169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.877 #34 NEW cov: 11842 ft: 14541 corp: 31/2383b lim: 120 exec/s: 34 rss: 70Mb L: 115/120 MS: 1 CrossOver- 00:08:23.877 [2024-07-13 10:39:40.033675] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167814144 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.877 [2024-07-13 10:39:40.033706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.877 [2024-07-13 10:39:40.033821] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:72048793649938432 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.877 [2024-07-13 10:39:40.033843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.877 [2024-07-13 10:39:40.033961] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:11 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.877 [2024-07-13 10:39:40.033983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.877 #40 NEW cov: 11842 ft: 14550 corp: 32/2467b lim: 120 exec/s: 40 rss: 70Mb L: 84/120 MS: 1 ChangeBit- 00:08:23.877 [2024-07-13 10:39:40.083124] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:3276800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.877 [2024-07-13 10:39:40.083160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.877 [2024-07-13 10:39:40.083278] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.877 [2024-07-13 10:39:40.083301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.877 #41 NEW cov: 11842 ft: 14561 corp: 33/2517b lim: 120 exec/s: 41 rss: 70Mb L: 50/120 MS: 1 ChangeBinInt- 00:08:23.877 [2024-07-13 10:39:40.124127] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.877 [2024-07-13 10:39:40.124158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.877 [2024-07-13 10:39:40.124255] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:138 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.877 [2024-07-13 10:39:40.124277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.877 [2024-07-13 10:39:40.124395] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:9910603678816504201 len:35210 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.877 [2024-07-13 10:39:40.124418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.877 [2024-07-13 10:39:40.124537] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:9910452457312979337 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.877 [2024-07-13 10:39:40.124560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.877 #42 NEW cov: 11842 ft: 14623 corp: 34/2627b lim: 120 exec/s: 42 rss: 70Mb L: 110/120 MS: 1 ChangeBit- 00:08:23.877 [2024-07-13 10:39:40.163952] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.877 [2024-07-13 10:39:40.163987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.877 [2024-07-13 10:39:40.164105] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.877 [2024-07-13 10:39:40.164123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.877 [2024-07-13 10:39:40.164246] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446741874686296319 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.877 [2024-07-13 10:39:40.164267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.877 #43 NEW cov: 11842 ft: 14628 corp: 35/2721b lim: 120 exec/s: 43 rss: 70Mb L: 94/120 MS: 1 PersAutoDict- DE: "\200\000\000\000"- 00:08:23.877 [2024-07-13 10:39:40.214124] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1099511627776 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.877 [2024-07-13 10:39:40.214156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.877 [2024-07-13 10:39:40.214267] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.877 [2024-07-13 10:39:40.214288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.877 [2024-07-13 10:39:40.214404] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.877 [2024-07-13 10:39:40.214423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.877 #44 NEW cov: 11842 ft: 14639 corp: 36/2811b lim: 120 exec/s: 44 rss: 70Mb L: 90/120 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\004"- 00:08:23.878 [2024-07-13 10:39:40.264030] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.878 [2024-07-13 10:39:40.264057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.878 [2024-07-13 10:39:40.264145] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.878 [2024-07-13 10:39:40.264165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.137 #45 NEW cov: 11842 ft: 14651 corp: 37/2876b lim: 120 exec/s: 45 rss: 70Mb L: 65/120 MS: 1 ChangeBit- 00:08:24.137 [2024-07-13 10:39:40.304382] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.137 [2024-07-13 10:39:40.304412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.137 [2024-07-13 10:39:40.304494] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.137 [2024-07-13 10:39:40.304520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.137 [2024-07-13 10:39:40.304656] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.137 [2024-07-13 10:39:40.304677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.137 #46 NEW cov: 11842 ft: 14663 corp: 38/2955b lim: 120 exec/s: 46 rss: 70Mb L: 79/120 MS: 1 CrossOver- 00:08:24.137 [2024-07-13 10:39:40.354326] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:3276800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.137 [2024-07-13 10:39:40.354352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.137 [2024-07-13 10:39:40.354481] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.137 [2024-07-13 10:39:40.354502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.137 #47 NEW cov: 11842 ft: 14680 corp: 39/3021b lim: 120 exec/s: 47 rss: 71Mb L: 66/120 MS: 1 CopyPart- 00:08:24.137 [2024-07-13 10:39:40.394577] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.137 [2024-07-13 10:39:40.394609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.137 [2024-07-13 10:39:40.394725] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.137 [2024-07-13 10:39:40.394748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.137 #48 NEW cov: 11842 ft: 14776 corp: 40/3090b lim: 120 exec/s: 48 rss: 71Mb L: 69/120 MS: 1 PersAutoDict- DE: "\200\000\000\000"- 00:08:24.137 [2024-07-13 10:39:40.434865] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.137 [2024-07-13 10:39:40.434895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.137 [2024-07-13 10:39:40.435009] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.137 [2024-07-13 10:39:40.435032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.137 [2024-07-13 10:39:40.435148] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446741874686296319 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.137 [2024-07-13 10:39:40.435171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.137 #49 NEW cov: 11842 ft: 14782 corp: 41/3184b lim: 120 exec/s: 24 rss: 71Mb L: 94/120 MS: 1 ChangeBinInt- 00:08:24.137 #49 DONE cov: 11842 ft: 14782 corp: 41/3184b lim: 120 exec/s: 24 rss: 71Mb 00:08:24.137 ###### Recommended dictionary. ###### 00:08:24.137 "\200\000\000\000" # Uses: 4 00:08:24.137 "\001\000\000\000\000\000\000\004" # Uses: 1 00:08:24.137 ###### End of recommended dictionary. ###### 00:08:24.137 Done 49 runs in 2 second(s) 00:08:24.396 10:39:40 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_17.conf 00:08:24.396 10:39:40 -- ../common.sh@72 -- # (( i++ )) 00:08:24.396 10:39:40 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:24.396 10:39:40 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:08:24.396 10:39:40 -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:08:24.396 10:39:40 -- nvmf/run.sh@24 -- # local timen=1 00:08:24.396 10:39:40 -- nvmf/run.sh@25 -- # local core=0x1 00:08:24.396 10:39:40 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:24.396 10:39:40 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:08:24.396 10:39:40 -- nvmf/run.sh@29 -- # printf %02d 18 00:08:24.396 10:39:40 -- nvmf/run.sh@29 -- # port=4418 00:08:24.396 10:39:40 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:24.396 10:39:40 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:08:24.396 10:39:40 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:24.396 10:39:40 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 -r /var/tmp/spdk18.sock 00:08:24.396 [2024-07-13 10:39:40.599240] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:08:24.396 [2024-07-13 10:39:40.599296] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1997426 ] 00:08:24.396 EAL: No free 2048 kB hugepages reported on node 1 00:08:24.396 [2024-07-13 10:39:40.769095] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:24.656 [2024-07-13 10:39:40.788320] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:24.656 [2024-07-13 10:39:40.788470] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:24.656 [2024-07-13 10:39:40.839885] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:24.656 [2024-07-13 10:39:40.856162] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:08:24.656 INFO: Running with entropic power schedule (0xFF, 100). 00:08:24.656 INFO: Seed: 4230572562 00:08:24.656 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:08:24.656 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:08:24.656 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:24.656 INFO: A corpus is not provided, starting from an empty corpus 00:08:24.656 #2 INITED exec/s: 0 rss: 61Mb 00:08:24.656 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:24.656 This may also happen if the target rejected all inputs we tried so far 00:08:24.656 [2024-07-13 10:39:40.901404] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.656 [2024-07-13 10:39:40.901433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.656 [2024-07-13 10:39:40.901499] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.656 [2024-07-13 10:39:40.901514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.656 [2024-07-13 10:39:40.901564] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:24.656 [2024-07-13 10:39:40.901578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.915 NEW_FUNC[1/670]: 0x4bbc80 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:08:24.915 NEW_FUNC[2/670]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:24.915 #4 NEW cov: 11559 ft: 11560 corp: 2/61b lim: 100 exec/s: 0 rss: 68Mb L: 60/60 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:24.915 [2024-07-13 10:39:41.212167] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.915 [2024-07-13 10:39:41.212199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.915 [2024-07-13 10:39:41.212251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.915 [2024-07-13 10:39:41.212265] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.915 [2024-07-13 10:39:41.212316] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:24.915 [2024-07-13 10:39:41.212330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.915 #10 NEW cov: 11672 ft: 12089 corp: 3/121b lim: 100 exec/s: 0 rss: 68Mb L: 60/60 MS: 1 ChangeBinInt- 00:08:24.915 [2024-07-13 10:39:41.252200] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.915 [2024-07-13 10:39:41.252228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.915 [2024-07-13 10:39:41.252271] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.915 [2024-07-13 10:39:41.252286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.915 [2024-07-13 10:39:41.252336] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:24.915 [2024-07-13 10:39:41.252351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.915 #11 NEW cov: 11678 ft: 12267 corp: 4/181b lim: 100 exec/s: 0 rss: 68Mb L: 60/60 MS: 1 ChangeBinInt- 00:08:24.915 [2024-07-13 10:39:41.292336] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:24.915 [2024-07-13 10:39:41.292362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.915 [2024-07-13 10:39:41.292412] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:24.915 [2024-07-13 10:39:41.292428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.915 [2024-07-13 10:39:41.292483] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:24.915 [2024-07-13 10:39:41.292497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.175 #12 NEW cov: 11763 ft: 12544 corp: 5/241b lim: 100 exec/s: 0 rss: 68Mb L: 60/60 MS: 1 CMP- DE: "\000\037"- 00:08:25.175 [2024-07-13 10:39:41.332349] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.175 [2024-07-13 10:39:41.332374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.175 [2024-07-13 10:39:41.332422] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.175 [2024-07-13 10:39:41.332436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.175 #13 NEW cov: 11763 ft: 12980 corp: 6/292b lim: 100 exec/s: 0 rss: 69Mb L: 51/60 MS: 1 EraseBytes- 00:08:25.175 [2024-07-13 10:39:41.372435] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.175 
[2024-07-13 10:39:41.372465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.175 [2024-07-13 10:39:41.372524] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.175 [2024-07-13 10:39:41.372538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.175 #14 NEW cov: 11763 ft: 13083 corp: 7/343b lim: 100 exec/s: 0 rss: 69Mb L: 51/60 MS: 1 ShuffleBytes- 00:08:25.175 [2024-07-13 10:39:41.412714] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.175 [2024-07-13 10:39:41.412742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.175 [2024-07-13 10:39:41.412777] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.175 [2024-07-13 10:39:41.412790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.175 [2024-07-13 10:39:41.412841] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:25.175 [2024-07-13 10:39:41.412856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.175 #15 NEW cov: 11763 ft: 13155 corp: 8/403b lim: 100 exec/s: 0 rss: 69Mb L: 60/60 MS: 1 ChangeBinInt- 00:08:25.175 [2024-07-13 10:39:41.452811] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.175 [2024-07-13 10:39:41.452838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.175 [2024-07-13 10:39:41.452870] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.175 [2024-07-13 10:39:41.452882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.175 [2024-07-13 10:39:41.452933] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:25.176 [2024-07-13 10:39:41.452947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.176 #16 NEW cov: 11763 ft: 13166 corp: 9/466b lim: 100 exec/s: 0 rss: 69Mb L: 63/63 MS: 1 CopyPart- 00:08:25.176 [2024-07-13 10:39:41.493066] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.176 [2024-07-13 10:39:41.493092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.176 [2024-07-13 10:39:41.493138] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.176 [2024-07-13 10:39:41.493152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.176 [2024-07-13 10:39:41.493201] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:25.176 [2024-07-13 10:39:41.493230] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.176 [2024-07-13 10:39:41.493280] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:25.176 [2024-07-13 10:39:41.493294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.176 #17 NEW cov: 11763 ft: 13489 corp: 10/564b lim: 100 exec/s: 0 rss: 69Mb L: 98/98 MS: 1 InsertRepeatedBytes- 00:08:25.176 [2024-07-13 10:39:41.533094] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.176 [2024-07-13 10:39:41.533121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.176 [2024-07-13 10:39:41.533153] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.176 [2024-07-13 10:39:41.533168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.176 [2024-07-13 10:39:41.533215] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:25.176 [2024-07-13 10:39:41.533230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.176 #18 NEW cov: 11763 ft: 13520 corp: 11/624b lim: 100 exec/s: 0 rss: 69Mb L: 60/98 MS: 1 ChangeByte- 00:08:25.436 [2024-07-13 10:39:41.573225] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.436 [2024-07-13 10:39:41.573251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.436 [2024-07-13 10:39:41.573288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.436 [2024-07-13 10:39:41.573303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.436 [2024-07-13 10:39:41.573354] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:25.436 [2024-07-13 10:39:41.573367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.436 #19 NEW cov: 11763 ft: 13607 corp: 12/684b lim: 100 exec/s: 0 rss: 69Mb L: 60/98 MS: 1 CrossOver- 00:08:25.436 [2024-07-13 10:39:41.613337] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.436 [2024-07-13 10:39:41.613363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.436 [2024-07-13 10:39:41.613403] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.436 [2024-07-13 10:39:41.613415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.436 [2024-07-13 10:39:41.613468] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:25.436 [2024-07-13 10:39:41.613482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:08:25.436 #20 NEW cov: 11763 ft: 13711 corp: 13/748b lim: 100 exec/s: 0 rss: 70Mb L: 64/98 MS: 1 InsertRepeatedBytes- 00:08:25.436 [2024-07-13 10:39:41.653423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.436 [2024-07-13 10:39:41.653453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.436 [2024-07-13 10:39:41.653486] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.436 [2024-07-13 10:39:41.653501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.436 [2024-07-13 10:39:41.653551] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:25.436 [2024-07-13 10:39:41.653565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.436 #21 NEW cov: 11763 ft: 13734 corp: 14/810b lim: 100 exec/s: 0 rss: 70Mb L: 62/98 MS: 1 PersAutoDict- DE: "\000\037"- 00:08:25.436 [2024-07-13 10:39:41.693548] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.436 [2024-07-13 10:39:41.693574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.436 [2024-07-13 10:39:41.693608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.436 [2024-07-13 10:39:41.693621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.436 [2024-07-13 10:39:41.693672] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:25.436 [2024-07-13 10:39:41.693687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.436 #22 NEW cov: 11763 ft: 13760 corp: 15/872b lim: 100 exec/s: 0 rss: 70Mb L: 62/98 MS: 1 ChangeBit- 00:08:25.436 [2024-07-13 10:39:41.733624] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.436 [2024-07-13 10:39:41.733650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.436 [2024-07-13 10:39:41.733683] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.436 [2024-07-13 10:39:41.733697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.436 [2024-07-13 10:39:41.733750] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:25.436 [2024-07-13 10:39:41.733765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.436 #24 NEW cov: 11763 ft: 13789 corp: 16/933b lim: 100 exec/s: 0 rss: 70Mb L: 61/98 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:25.436 [2024-07-13 10:39:41.773743] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.436 [2024-07-13 10:39:41.773775] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.437 [2024-07-13 10:39:41.773815] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.437 [2024-07-13 10:39:41.773829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.437 [2024-07-13 10:39:41.773881] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:25.437 [2024-07-13 10:39:41.773896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.437 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:25.437 #25 NEW cov: 11786 ft: 13858 corp: 17/993b lim: 100 exec/s: 0 rss: 70Mb L: 60/98 MS: 1 ChangeByte- 00:08:25.437 [2024-07-13 10:39:41.813612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.437 [2024-07-13 10:39:41.813638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.697 #31 NEW cov: 11786 ft: 14212 corp: 18/1025b lim: 100 exec/s: 0 rss: 70Mb L: 32/98 MS: 1 EraseBytes- 00:08:25.697 [2024-07-13 10:39:41.853983] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.697 [2024-07-13 10:39:41.854009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.697 [2024-07-13 10:39:41.854044] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.697 [2024-07-13 10:39:41.854059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.697 [2024-07-13 10:39:41.854110] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:25.697 [2024-07-13 10:39:41.854125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.697 #32 NEW cov: 11786 ft: 14224 corp: 19/1087b lim: 100 exec/s: 0 rss: 70Mb L: 62/98 MS: 1 ChangeBit- 00:08:25.697 [2024-07-13 10:39:41.894063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.697 [2024-07-13 10:39:41.894089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.697 [2024-07-13 10:39:41.894124] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.697 [2024-07-13 10:39:41.894138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.697 [2024-07-13 10:39:41.894188] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:25.697 [2024-07-13 10:39:41.894203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.697 #33 NEW cov: 11786 ft: 14248 corp: 20/1148b lim: 100 exec/s: 33 rss: 70Mb L: 61/98 MS: 1 InsertByte- 00:08:25.697 [2024-07-13 
10:39:41.924160] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.697 [2024-07-13 10:39:41.924186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.698 [2024-07-13 10:39:41.924235] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.698 [2024-07-13 10:39:41.924249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.698 [2024-07-13 10:39:41.924299] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:25.698 [2024-07-13 10:39:41.924314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.698 #34 NEW cov: 11786 ft: 14311 corp: 21/1208b lim: 100 exec/s: 34 rss: 70Mb L: 60/98 MS: 1 ChangeByte- 00:08:25.698 [2024-07-13 10:39:41.964273] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.698 [2024-07-13 10:39:41.964299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.698 [2024-07-13 10:39:41.964335] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.698 [2024-07-13 10:39:41.964349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.698 [2024-07-13 10:39:41.964399] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:25.698 [2024-07-13 10:39:41.964413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.698 #35 NEW cov: 11786 ft: 14345 corp: 22/1268b lim: 100 exec/s: 35 rss: 70Mb L: 60/98 MS: 1 ChangeBit- 00:08:25.698 [2024-07-13 10:39:41.994402] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.698 [2024-07-13 10:39:41.994428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.698 [2024-07-13 10:39:41.994468] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.698 [2024-07-13 10:39:41.994481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.698 [2024-07-13 10:39:41.994533] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:25.698 [2024-07-13 10:39:41.994547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.698 #36 NEW cov: 11786 ft: 14390 corp: 23/1328b lim: 100 exec/s: 36 rss: 70Mb L: 60/98 MS: 1 ShuffleBytes- 00:08:25.698 [2024-07-13 10:39:42.034470] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.698 [2024-07-13 10:39:42.034495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.698 [2024-07-13 10:39:42.034536] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.698 [2024-07-13 10:39:42.034551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.698 [2024-07-13 10:39:42.034603] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:25.698 [2024-07-13 10:39:42.034633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.698 #37 NEW cov: 11786 ft: 14428 corp: 24/1392b lim: 100 exec/s: 37 rss: 70Mb L: 64/98 MS: 1 ChangeBit- 00:08:25.698 [2024-07-13 10:39:42.074640] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.698 [2024-07-13 10:39:42.074666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.698 [2024-07-13 10:39:42.074719] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.698 [2024-07-13 10:39:42.074733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.698 [2024-07-13 10:39:42.074785] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:25.698 [2024-07-13 10:39:42.074800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.958 #38 NEW cov: 11786 ft: 14434 corp: 25/1454b lim: 100 exec/s: 38 rss: 70Mb L: 62/98 MS: 1 CMP- DE: "\377\014"- 00:08:25.958 [2024-07-13 10:39:42.114863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.958 [2024-07-13 10:39:42.114892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.958 [2024-07-13 10:39:42.114927] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.958 [2024-07-13 10:39:42.114942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.958 [2024-07-13 10:39:42.114995] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:25.959 [2024-07-13 10:39:42.115009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.959 [2024-07-13 10:39:42.115061] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:25.959 [2024-07-13 10:39:42.115075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.959 #39 NEW cov: 11786 ft: 14448 corp: 26/1540b lim: 100 exec/s: 39 rss: 70Mb L: 86/98 MS: 1 CrossOver- 00:08:25.959 [2024-07-13 10:39:42.154973] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.959 [2024-07-13 10:39:42.154999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.959 [2024-07-13 10:39:42.155043] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.959 
[2024-07-13 10:39:42.155058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.959 [2024-07-13 10:39:42.155108] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:25.959 [2024-07-13 10:39:42.155121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.959 [2024-07-13 10:39:42.155171] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:25.959 [2024-07-13 10:39:42.155185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.959 #40 NEW cov: 11786 ft: 14464 corp: 27/1631b lim: 100 exec/s: 40 rss: 70Mb L: 91/98 MS: 1 CrossOver- 00:08:25.959 [2024-07-13 10:39:42.194980] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.959 [2024-07-13 10:39:42.195007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.959 [2024-07-13 10:39:42.195041] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.959 [2024-07-13 10:39:42.195055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.959 [2024-07-13 10:39:42.195106] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:25.959 [2024-07-13 10:39:42.195119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.959 #41 NEW cov: 11786 ft: 14476 corp: 28/1693b lim: 100 exec/s: 41 rss: 70Mb L: 62/98 MS: 1 CMP- DE: "\001\000\000\000"- 00:08:25.959 [2024-07-13 10:39:42.235074] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.959 [2024-07-13 10:39:42.235099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.959 [2024-07-13 10:39:42.235136] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.959 [2024-07-13 10:39:42.235150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.959 [2024-07-13 10:39:42.235199] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:25.959 [2024-07-13 10:39:42.235216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.959 #42 NEW cov: 11786 ft: 14521 corp: 29/1754b lim: 100 exec/s: 42 rss: 70Mb L: 61/98 MS: 1 ChangeBinInt- 00:08:25.959 [2024-07-13 10:39:42.275248] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.959 [2024-07-13 10:39:42.275273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.959 [2024-07-13 10:39:42.275335] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.959 [2024-07-13 10:39:42.275350] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.959 [2024-07-13 10:39:42.275401] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:25.959 [2024-07-13 10:39:42.275415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.959 #43 NEW cov: 11786 ft: 14543 corp: 30/1817b lim: 100 exec/s: 43 rss: 70Mb L: 63/98 MS: 1 PersAutoDict- DE: "\377\014"- 00:08:25.959 [2024-07-13 10:39:42.315301] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:25.959 [2024-07-13 10:39:42.315326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.959 [2024-07-13 10:39:42.315362] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:25.959 [2024-07-13 10:39:42.315377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.959 [2024-07-13 10:39:42.315428] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:25.959 [2024-07-13 10:39:42.315461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.959 #44 NEW cov: 11786 ft: 14561 corp: 31/1889b lim: 100 exec/s: 44 rss: 70Mb L: 72/98 MS: 1 CMP- DE: "\036\317\361\266\301h)\000"- 00:08:26.218 [2024-07-13 10:39:42.355440] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:26.219 [2024-07-13 10:39:42.355470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.219 [2024-07-13 10:39:42.355508] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:26.219 [2024-07-13 10:39:42.355524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.219 [2024-07-13 10:39:42.355574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:26.219 [2024-07-13 10:39:42.355587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.219 #45 NEW cov: 11786 ft: 14571 corp: 32/1949b lim: 100 exec/s: 45 rss: 70Mb L: 60/98 MS: 1 ShuffleBytes- 00:08:26.219 [2024-07-13 10:39:42.395464] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:26.219 [2024-07-13 10:39:42.395489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.219 [2024-07-13 10:39:42.395532] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:26.219 [2024-07-13 10:39:42.395547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.219 #46 NEW cov: 11786 ft: 14606 corp: 33/2000b lim: 100 exec/s: 46 rss: 71Mb L: 51/98 MS: 1 ChangeBit- 00:08:26.219 [2024-07-13 10:39:42.435747] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE 
ZEROES (08) sqid:1 cid:0 nsid:0 00:08:26.219 [2024-07-13 10:39:42.435773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.219 [2024-07-13 10:39:42.435812] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:26.219 [2024-07-13 10:39:42.435824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.219 [2024-07-13 10:39:42.435874] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:26.219 [2024-07-13 10:39:42.435889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.219 [2024-07-13 10:39:42.435939] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:26.219 [2024-07-13 10:39:42.435953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.219 #47 NEW cov: 11786 ft: 14617 corp: 34/2098b lim: 100 exec/s: 47 rss: 71Mb L: 98/98 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:08:26.219 [2024-07-13 10:39:42.475863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:26.219 [2024-07-13 10:39:42.475889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.219 [2024-07-13 10:39:42.475929] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:26.219 [2024-07-13 10:39:42.475943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.219 [2024-07-13 10:39:42.475994] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:26.219 [2024-07-13 10:39:42.476009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.219 [2024-07-13 10:39:42.476059] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:26.219 [2024-07-13 10:39:42.476074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.219 #48 NEW cov: 11786 ft: 14626 corp: 35/2184b lim: 100 exec/s: 48 rss: 71Mb L: 86/98 MS: 1 ChangeBinInt- 00:08:26.219 [2024-07-13 10:39:42.515867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:26.219 [2024-07-13 10:39:42.515892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.219 [2024-07-13 10:39:42.515928] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:26.219 [2024-07-13 10:39:42.515942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.219 [2024-07-13 10:39:42.515994] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:26.219 [2024-07-13 10:39:42.516008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.219 #49 NEW cov: 11786 ft: 14628 corp: 36/2249b lim: 100 exec/s: 49 rss: 71Mb L: 65/98 MS: 1 PersAutoDict- DE: "\000\037"- 00:08:26.219 [2024-07-13 10:39:42.555977] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:26.219 [2024-07-13 10:39:42.556003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.219 [2024-07-13 10:39:42.556055] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:26.219 [2024-07-13 10:39:42.556069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.219 [2024-07-13 10:39:42.556122] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:26.219 [2024-07-13 10:39:42.556135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.219 #50 NEW cov: 11786 ft: 14651 corp: 37/2310b lim: 100 exec/s: 50 rss: 71Mb L: 61/98 MS: 1 CopyPart- 00:08:26.219 [2024-07-13 10:39:42.596096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:26.219 [2024-07-13 10:39:42.596122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.219 [2024-07-13 10:39:42.596160] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:26.219 [2024-07-13 10:39:42.596174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.219 [2024-07-13 10:39:42.596226] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:26.219 [2024-07-13 10:39:42.596238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.479 #51 NEW cov: 11786 ft: 14653 corp: 38/2371b lim: 100 exec/s: 51 rss: 71Mb L: 61/98 MS: 1 InsertByte- 00:08:26.479 [2024-07-13 10:39:42.636219] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:26.479 [2024-07-13 10:39:42.636246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.479 [2024-07-13 10:39:42.636282] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:26.479 [2024-07-13 10:39:42.636296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.479 [2024-07-13 10:39:42.636348] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:26.479 [2024-07-13 10:39:42.636364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.479 #52 NEW cov: 11786 ft: 14674 corp: 39/2441b lim: 100 exec/s: 52 rss: 71Mb L: 70/98 MS: 1 PersAutoDict- DE: "\036\317\361\266\301h)\000"- 00:08:26.479 [2024-07-13 10:39:42.676362] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:26.479 [2024-07-13 
10:39:42.676388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.479 [2024-07-13 10:39:42.676423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:26.479 [2024-07-13 10:39:42.676436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.479 [2024-07-13 10:39:42.676507] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:26.479 [2024-07-13 10:39:42.676522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.479 #53 NEW cov: 11786 ft: 14684 corp: 40/2511b lim: 100 exec/s: 53 rss: 71Mb L: 70/98 MS: 1 ChangeByte- 00:08:26.479 [2024-07-13 10:39:42.716505] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:26.479 [2024-07-13 10:39:42.716531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.479 [2024-07-13 10:39:42.716566] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:26.479 [2024-07-13 10:39:42.716581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.479 [2024-07-13 10:39:42.716631] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:26.479 [2024-07-13 10:39:42.716646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.479 #54 NEW cov: 11786 ft: 14697 corp: 41/2571b lim: 100 exec/s: 54 rss: 71Mb L: 60/98 MS: 1 CopyPart- 00:08:26.479 [2024-07-13 10:39:42.756588] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:26.479 [2024-07-13 10:39:42.756614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.479 [2024-07-13 10:39:42.756652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:26.479 [2024-07-13 10:39:42.756665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.479 [2024-07-13 10:39:42.756716] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:26.479 [2024-07-13 10:39:42.756731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.479 #55 NEW cov: 11786 ft: 14702 corp: 42/2632b lim: 100 exec/s: 55 rss: 71Mb L: 61/98 MS: 1 ChangeBinInt- 00:08:26.479 [2024-07-13 10:39:42.786654] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:26.479 [2024-07-13 10:39:42.786680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.479 [2024-07-13 10:39:42.786719] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:26.479 [2024-07-13 10:39:42.786734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.479 [2024-07-13 10:39:42.786785] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:26.479 [2024-07-13 10:39:42.786799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.479 #56 NEW cov: 11786 ft: 14766 corp: 43/2697b lim: 100 exec/s: 56 rss: 71Mb L: 65/98 MS: 1 CopyPart- 00:08:26.479 [2024-07-13 10:39:42.816720] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:26.479 [2024-07-13 10:39:42.816746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.479 [2024-07-13 10:39:42.816781] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:26.479 [2024-07-13 10:39:42.816796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.479 [2024-07-13 10:39:42.816847] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:26.479 [2024-07-13 10:39:42.816861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.479 #57 NEW cov: 11786 ft: 14773 corp: 44/2761b lim: 100 exec/s: 57 rss: 71Mb L: 64/98 MS: 1 ChangeASCIIInt- 00:08:26.479 [2024-07-13 10:39:42.846783] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:26.479 [2024-07-13 10:39:42.846810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.480 [2024-07-13 10:39:42.846861] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:26.480 [2024-07-13 10:39:42.846877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.480 [2024-07-13 10:39:42.846928] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:26.480 [2024-07-13 10:39:42.846943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.738 #58 NEW cov: 11786 ft: 14804 corp: 45/2821b lim: 100 exec/s: 58 rss: 71Mb L: 60/98 MS: 1 ChangeBinInt- 00:08:26.738 [2024-07-13 10:39:42.886928] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:26.738 [2024-07-13 10:39:42.886954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.738 [2024-07-13 10:39:42.886998] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:26.738 [2024-07-13 10:39:42.887012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.738 [2024-07-13 10:39:42.887064] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:26.738 [2024-07-13 10:39:42.887079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 
m:0 dnr:1
00:08:26.738 #59 NEW cov: 11786 ft: 14814 corp: 46/2882b lim: 100 exec/s: 29 rss: 72Mb L: 61/98 MS: 1 ChangeBinInt-
00:08:26.738 #59 DONE cov: 11786 ft: 14814 corp: 46/2882b lim: 100 exec/s: 29 rss: 72Mb
00:08:26.738 ###### Recommended dictionary. ######
00:08:26.738 "\000\037" # Uses: 3
00:08:26.738 "\377\014" # Uses: 1
00:08:26.738 "\001\000\000\000" # Uses: 1
00:08:26.738 "\036\317\361\266\301h)\000" # Uses: 1
00:08:26.738 ###### End of recommended dictionary. ######
00:08:26.738 Done 59 runs in 2 second(s)
00:08:26.738 10:39:43 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_18.conf
00:08:26.738 10:39:43 -- ../common.sh@72 -- # (( i++ ))
00:08:26.738 10:39:43 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:26.738 10:39:43 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1
00:08:26.738 10:39:43 -- nvmf/run.sh@23 -- # local fuzzer_type=19
00:08:26.738 10:39:43 -- nvmf/run.sh@24 -- # local timen=1
00:08:26.738 10:39:43 -- nvmf/run.sh@25 -- # local core=0x1
00:08:26.738 10:39:43 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19
00:08:26.738 10:39:43 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf
00:08:26.738 10:39:43 -- nvmf/run.sh@29 -- # printf %02d 19
00:08:26.738 10:39:43 -- nvmf/run.sh@29 -- # port=4419
00:08:26.738 10:39:43 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19
00:08:26.738 10:39:43 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419'
00:08:26.738 10:39:43 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:26.738 10:39:43 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 -r /var/tmp/spdk19.sock
00:08:26.997 [2024-07-13 10:39:43.064962] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization...
00:08:26.997 [2024-07-13 10:39:43.065041] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1997902 ]
00:08:26.997 EAL: No free 2048 kB hugepages reported on node 1
00:08:26.997 [2024-07-13 10:39:43.244489] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:26.997 [2024-07-13 10:39:43.263818] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:26.997 [2024-07-13 10:39:43.263939] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:26.997 [2024-07-13 10:39:43.315351] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:26.997 [2024-07-13 10:39:43.331659] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 ***
00:08:26.997 INFO: Running with entropic power schedule (0xFF, 100).
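The shell trace above shows how nvmf/run.sh points each fuzzer instance at its own NVMe/TCP listener: the fuzzer number is zero-padded and appended to "44" to form the port, and the JSON config's default 4420 listener is rewritten to match before llvm_nvme_fuzz is launched. A minimal sketch of that derivation, assuming the same scheme as the trace (the fuzz_json.conf path is shortened here for illustration):

  fuzzer_type=19
  port="44$(printf %02d "$fuzzer_type")"    # 19 -> 4419, matching trsvcid:4419 in the trace
  trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
  # Rewrite the config's default 4420 listener to the derived port:
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" fuzz_json.conf > "/tmp/fuzz_json_${fuzzer_type}.conf"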
00:08:26.997 INFO: Seed: 2409588547 00:08:26.997 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:08:26.997 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:08:26.997 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:26.997 INFO: A corpus is not provided, starting from an empty corpus 00:08:26.997 #2 INITED exec/s: 0 rss: 60Mb 00:08:26.997 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:26.997 This may also happen if the target rejected all inputs we tried so far 00:08:26.997 [2024-07-13 10:39:43.380758] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:26.997 [2024-07-13 10:39:43.380788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.514 NEW_FUNC[1/670]: 0x4bec40 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:08:27.514 NEW_FUNC[2/670]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:27.514 #29 NEW cov: 11535 ft: 11533 corp: 2/16b lim: 50 exec/s: 0 rss: 68Mb L: 15/15 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:27.514 [2024-07-13 10:39:43.691470] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:27.514 [2024-07-13 10:39:43.691502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.514 #30 NEW cov: 11650 ft: 12065 corp: 3/35b lim: 50 exec/s: 0 rss: 68Mb L: 19/19 MS: 1 CopyPart- 00:08:27.514 [2024-07-13 10:39:43.731956] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:5280832617179597129 len:18762 00:08:27.514 [2024-07-13 10:39:43.731985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.514 [2024-07-13 10:39:43.732021] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5280832617179597129 len:18762 00:08:27.515 [2024-07-13 10:39:43.732035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.515 [2024-07-13 10:39:43.732086] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:5280832617179597129 len:18762 00:08:27.515 [2024-07-13 10:39:43.732100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.515 [2024-07-13 10:39:43.732151] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:5280832617179597129 len:18762 00:08:27.515 [2024-07-13 10:39:43.732166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.515 [2024-07-13 10:39:43.732216] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:5280832617179597129 len:6683 00:08:27.515 [2024-07-13 10:39:43.732232] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:27.515 #39 NEW cov: 11656 ft: 12701 corp: 4/85b lim: 50 exec/s: 0 rss: 68Mb L: 50/50 MS: 4 ChangeBit-ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:08:27.515 [2024-07-13 10:39:43.771644] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446730879570018303 len:65536 00:08:27.515 [2024-07-13 10:39:43.771673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.515 #40 NEW cov: 11741 ft: 13054 corp: 5/100b lim: 50 exec/s: 0 rss: 68Mb L: 15/50 MS: 1 ChangeByte- 00:08:27.515 [2024-07-13 10:39:43.811983] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:27.515 [2024-07-13 10:39:43.812010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.515 [2024-07-13 10:39:43.812047] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:27.515 [2024-07-13 10:39:43.812063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.515 [2024-07-13 10:39:43.812116] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65524 00:08:27.515 [2024-07-13 10:39:43.812133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.515 #41 NEW cov: 11741 ft: 13443 corp: 6/138b lim: 50 exec/s: 0 rss: 68Mb L: 38/50 MS: 1 InsertRepeatedBytes- 00:08:27.515 [2024-07-13 10:39:43.852007] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2821266740684990247 len:10024 00:08:27.515 [2024-07-13 10:39:43.852034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.515 [2024-07-13 10:39:43.852087] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744070071517183 len:65536 00:08:27.515 [2024-07-13 10:39:43.852103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.515 #42 NEW cov: 11741 ft: 13709 corp: 7/165b lim: 50 exec/s: 0 rss: 68Mb L: 27/50 MS: 1 InsertRepeatedBytes- 00:08:27.515 [2024-07-13 10:39:43.892286] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3906369333256140342 len:13879 00:08:27.515 [2024-07-13 10:39:43.892313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.515 [2024-07-13 10:39:43.892355] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3906369333256140342 len:13879 00:08:27.515 [2024-07-13 10:39:43.892368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.515 [2024-07-13 10:39:43.892419] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 
cid:2 nsid:0 lba:3906369333256140342 len:13879 00:08:27.515 [2024-07-13 10:39:43.892435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.515 [2024-07-13 10:39:43.892489] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3906369333256140342 len:13879 00:08:27.515 [2024-07-13 10:39:43.892503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.774 #43 NEW cov: 11741 ft: 13769 corp: 8/211b lim: 50 exec/s: 0 rss: 68Mb L: 46/50 MS: 1 InsertRepeatedBytes- 00:08:27.774 [2024-07-13 10:39:43.932279] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:27.774 [2024-07-13 10:39:43.932307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.774 [2024-07-13 10:39:43.932347] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18376094954066804735 len:65536 00:08:27.774 [2024-07-13 10:39:43.932362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.774 [2024-07-13 10:39:43.932414] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65524 00:08:27.774 [2024-07-13 10:39:43.932429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.774 #44 NEW cov: 11741 ft: 13802 corp: 9/249b lim: 50 exec/s: 0 rss: 68Mb L: 38/50 MS: 1 ChangeBinInt- 00:08:27.774 [2024-07-13 10:39:43.972171] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:27.774 [2024-07-13 10:39:43.972198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.774 #45 NEW cov: 11741 ft: 13864 corp: 10/264b lim: 50 exec/s: 0 rss: 68Mb L: 15/50 MS: 1 ShuffleBytes- 00:08:27.774 [2024-07-13 10:39:44.002289] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65504 00:08:27.774 [2024-07-13 10:39:44.002316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.774 #46 NEW cov: 11741 ft: 13962 corp: 11/279b lim: 50 exec/s: 0 rss: 69Mb L: 15/50 MS: 1 ChangeBit- 00:08:27.774 [2024-07-13 10:39:44.032707] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:27.774 [2024-07-13 10:39:44.032735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.774 [2024-07-13 10:39:44.032774] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:27.774 [2024-07-13 10:39:44.032789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.774 [2024-07-13 10:39:44.032838] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE 
UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:27.774 [2024-07-13 10:39:44.032853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.774 [2024-07-13 10:39:44.032905] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:27.774 [2024-07-13 10:39:44.032919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.774 #49 NEW cov: 11741 ft: 13979 corp: 12/324b lim: 50 exec/s: 0 rss: 69Mb L: 45/50 MS: 3 CrossOver-ShuffleBytes-InsertRepeatedBytes- 00:08:27.774 [2024-07-13 10:39:44.062524] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2821266740684990247 len:10024 00:08:27.774 [2024-07-13 10:39:44.062552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.774 [2024-07-13 10:39:44.062595] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744070071517183 len:65295 00:08:27.774 [2024-07-13 10:39:44.062610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.774 #50 NEW cov: 11741 ft: 14048 corp: 13/344b lim: 50 exec/s: 0 rss: 69Mb L: 20/50 MS: 1 EraseBytes- 00:08:27.774 [2024-07-13 10:39:44.102777] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2821266740684990247 len:10024 00:08:27.774 [2024-07-13 10:39:44.102806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.774 [2024-07-13 10:39:44.102842] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744070071517183 len:65295 00:08:27.774 [2024-07-13 10:39:44.102859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.774 [2024-07-13 10:39:44.102914] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65504 00:08:27.774 [2024-07-13 10:39:44.102930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.774 #51 NEW cov: 11741 ft: 14085 corp: 14/379b lim: 50 exec/s: 0 rss: 69Mb L: 35/50 MS: 1 CrossOver- 00:08:27.774 [2024-07-13 10:39:44.142983] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:27.774 [2024-07-13 10:39:44.143010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.774 [2024-07-13 10:39:44.143052] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:27.774 [2024-07-13 10:39:44.143068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.774 [2024-07-13 10:39:44.143123] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:27.774 [2024-07-13 10:39:44.143138] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.774 [2024-07-13 10:39:44.143187] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:27.774 [2024-07-13 10:39:44.143202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.034 #52 NEW cov: 11741 ft: 14108 corp: 15/426b lim: 50 exec/s: 0 rss: 69Mb L: 47/50 MS: 1 CopyPart- 00:08:28.034 [2024-07-13 10:39:44.183125] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:28.034 [2024-07-13 10:39:44.183155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.034 [2024-07-13 10:39:44.183191] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:28.034 [2024-07-13 10:39:44.183208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.034 [2024-07-13 10:39:44.183261] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:28.034 [2024-07-13 10:39:44.183275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.034 [2024-07-13 10:39:44.183329] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:28.034 [2024-07-13 10:39:44.183344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.034 #53 NEW cov: 11741 ft: 14133 corp: 16/471b lim: 50 exec/s: 0 rss: 69Mb L: 45/50 MS: 1 CopyPart- 00:08:28.034 [2024-07-13 10:39:44.222953] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:28.034 [2024-07-13 10:39:44.222981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.034 #54 NEW cov: 11741 ft: 14150 corp: 17/490b lim: 50 exec/s: 0 rss: 69Mb L: 19/50 MS: 1 ChangeBit- 00:08:28.034 [2024-07-13 10:39:44.263044] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446655013267701759 len:65536 00:08:28.034 [2024-07-13 10:39:44.263070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.034 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:28.034 #55 NEW cov: 11764 ft: 14205 corp: 18/506b lim: 50 exec/s: 0 rss: 69Mb L: 16/50 MS: 1 InsertByte- 00:08:28.034 [2024-07-13 10:39:44.303501] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2821266740684990247 len:10024 00:08:28.034 [2024-07-13 10:39:44.303529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.034 [2024-07-13 10:39:44.303569] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744070071517183 len:65295 
00:08:28.034 [2024-07-13 10:39:44.303584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.034 [2024-07-13 10:39:44.303636] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:4294901760 len:1 00:08:28.034 [2024-07-13 10:39:44.303652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.034 [2024-07-13 10:39:44.303705] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744069414584575 len:65536 00:08:28.034 [2024-07-13 10:39:44.303722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.034 #56 NEW cov: 11764 ft: 14299 corp: 19/552b lim: 50 exec/s: 0 rss: 69Mb L: 46/50 MS: 1 InsertRepeatedBytes- 00:08:28.034 [2024-07-13 10:39:44.343394] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2821266740684990247 len:10024 00:08:28.034 [2024-07-13 10:39:44.343421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.034 [2024-07-13 10:39:44.343476] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744070071517183 len:65536 00:08:28.034 [2024-07-13 10:39:44.343492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.034 #57 NEW cov: 11764 ft: 14331 corp: 20/579b lim: 50 exec/s: 0 rss: 69Mb L: 27/50 MS: 1 ShuffleBytes- 00:08:28.034 [2024-07-13 10:39:44.383564] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2821266740684990247 len:10024 00:08:28.034 [2024-07-13 10:39:44.383592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.034 [2024-07-13 10:39:44.383642] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744070071517183 len:65536 00:08:28.034 [2024-07-13 10:39:44.383658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.034 #58 NEW cov: 11764 ft: 14337 corp: 21/606b lim: 50 exec/s: 58 rss: 69Mb L: 27/50 MS: 1 ChangeByte- 00:08:28.293 [2024-07-13 10:39:44.423886] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:28.293 [2024-07-13 10:39:44.423915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.293 [2024-07-13 10:39:44.423952] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:28.293 [2024-07-13 10:39:44.423969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.293 [2024-07-13 10:39:44.424021] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:28.293 [2024-07-13 10:39:44.424037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.293 [2024-07-13 10:39:44.424089] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:256 00:08:28.293 [2024-07-13 10:39:44.424104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.293 #59 NEW cov: 11764 ft: 14377 corp: 22/651b lim: 50 exec/s: 59 rss: 70Mb L: 45/50 MS: 1 CopyPart- 00:08:28.293 [2024-07-13 10:39:44.464005] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744070041698303 len:65536 00:08:28.293 [2024-07-13 10:39:44.464033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.293 [2024-07-13 10:39:44.464070] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:28.293 [2024-07-13 10:39:44.464086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.293 [2024-07-13 10:39:44.464140] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:28.293 [2024-07-13 10:39:44.464157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.293 [2024-07-13 10:39:44.464212] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:28.293 [2024-07-13 10:39:44.464227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.293 #63 NEW cov: 11764 ft: 14448 corp: 23/696b lim: 50 exec/s: 63 rss: 70Mb L: 45/50 MS: 4 CopyPart-InsertByte-ChangeByte-InsertRepeatedBytes- 00:08:28.294 [2024-07-13 10:39:44.504051] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:28.294 [2024-07-13 10:39:44.504079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.294 [2024-07-13 10:39:44.504114] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:28.294 [2024-07-13 10:39:44.504129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.294 [2024-07-13 10:39:44.504181] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65524 00:08:28.294 [2024-07-13 10:39:44.504195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.294 #64 NEW cov: 11764 ft: 14461 corp: 24/734b lim: 50 exec/s: 64 rss: 70Mb L: 38/50 MS: 1 CopyPart- 00:08:28.294 [2024-07-13 10:39:44.544222] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3906369333256140342 len:13879 00:08:28.294 [2024-07-13 10:39:44.544249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:08:28.294 [2024-07-13 10:39:44.544284] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3906369333256140342 len:13879 00:08:28.294 [2024-07-13 10:39:44.544300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.294 [2024-07-13 10:39:44.544353] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3906369333272917558 len:13879 00:08:28.294 [2024-07-13 10:39:44.544367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.294 [2024-07-13 10:39:44.544419] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3906369333256140342 len:13879 00:08:28.294 [2024-07-13 10:39:44.544434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.294 #65 NEW cov: 11764 ft: 14505 corp: 25/780b lim: 50 exec/s: 65 rss: 70Mb L: 46/50 MS: 1 ChangeBit- 00:08:28.294 [2024-07-13 10:39:44.584131] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2821264541661734695 len:10024 00:08:28.294 [2024-07-13 10:39:44.584159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.294 [2024-07-13 10:39:44.584208] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744070071517183 len:65295 00:08:28.294 [2024-07-13 10:39:44.584224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.294 #66 NEW cov: 11764 ft: 14510 corp: 26/800b lim: 50 exec/s: 66 rss: 70Mb L: 20/50 MS: 1 ChangeByte- 00:08:28.294 [2024-07-13 10:39:44.624103] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:28.294 [2024-07-13 10:39:44.624131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.294 #67 NEW cov: 11764 ft: 14526 corp: 27/819b lim: 50 exec/s: 67 rss: 70Mb L: 19/50 MS: 1 CopyPart- 00:08:28.294 [2024-07-13 10:39:44.654554] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744070041698303 len:65536 00:08:28.294 [2024-07-13 10:39:44.654581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.294 [2024-07-13 10:39:44.654616] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:28.294 [2024-07-13 10:39:44.654631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.294 [2024-07-13 10:39:44.654683] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:28.294 [2024-07-13 10:39:44.654698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.294 [2024-07-13 10:39:44.654749] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:28.294 [2024-07-13 10:39:44.654764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.553 #68 NEW cov: 11764 ft: 14530 corp: 28/864b lim: 50 exec/s: 68 rss: 70Mb L: 45/50 MS: 1 ShuffleBytes- 00:08:28.553 [2024-07-13 10:39:44.694710] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3906369333256140342 len:13879 00:08:28.553 [2024-07-13 10:39:44.694738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.553 [2024-07-13 10:39:44.694788] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3906369333256140342 len:13879 00:08:28.553 [2024-07-13 10:39:44.694804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.553 [2024-07-13 10:39:44.694852] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3906369333256205878 len:13879 00:08:28.553 [2024-07-13 10:39:44.694868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.553 [2024-07-13 10:39:44.694919] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3906369333256140342 len:13879 00:08:28.553 [2024-07-13 10:39:44.694935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.553 #69 NEW cov: 11764 ft: 14539 corp: 29/910b lim: 50 exec/s: 69 rss: 70Mb L: 46/50 MS: 1 ChangeBit- 00:08:28.553 [2024-07-13 10:39:44.734535] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:28.553 [2024-07-13 10:39:44.734561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.553 [2024-07-13 10:39:44.734610] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:28.553 [2024-07-13 10:39:44.734625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.553 #70 NEW cov: 11764 ft: 14597 corp: 30/937b lim: 50 exec/s: 70 rss: 70Mb L: 27/50 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:28.553 [2024-07-13 10:39:44.774906] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744070041698303 len:65536 00:08:28.553 [2024-07-13 10:39:44.774933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.553 [2024-07-13 10:39:44.774969] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:28.553 [2024-07-13 10:39:44.774987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.553 [2024-07-13 10:39:44.775036] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:61440 00:08:28.553 [2024-07-13 10:39:44.775051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.553 [2024-07-13 10:39:44.775121] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:28.553 [2024-07-13 10:39:44.775136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.553 #71 NEW cov: 11764 ft: 14618 corp: 31/982b lim: 50 exec/s: 71 rss: 70Mb L: 45/50 MS: 1 ChangeBit- 00:08:28.553 [2024-07-13 10:39:44.814906] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:28.553 [2024-07-13 10:39:44.814933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.553 [2024-07-13 10:39:44.814968] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709496063 len:65536 00:08:28.553 [2024-07-13 10:39:44.814983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.553 [2024-07-13 10:39:44.815034] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65524 00:08:28.553 [2024-07-13 10:39:44.815049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.553 #72 NEW cov: 11764 ft: 14679 corp: 32/1020b lim: 50 exec/s: 72 rss: 70Mb L: 38/50 MS: 1 ChangeBinInt- 00:08:28.553 [2024-07-13 10:39:44.854842] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446743523953737727 len:65295 00:08:28.553 [2024-07-13 10:39:44.854869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.553 #73 NEW cov: 11764 ft: 14683 corp: 33/1030b lim: 50 exec/s: 73 rss: 70Mb L: 10/50 MS: 1 EraseBytes- 00:08:28.553 [2024-07-13 10:39:44.895051] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:28.553 [2024-07-13 10:39:44.895077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.553 [2024-07-13 10:39:44.895129] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446742978492891135 len:256 00:08:28.553 [2024-07-13 10:39:44.895145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.553 #74 NEW cov: 11764 ft: 14705 corp: 34/1052b lim: 50 exec/s: 74 rss: 70Mb L: 22/50 MS: 1 CrossOver- 00:08:28.553 [2024-07-13 10:39:44.935395] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:28.553 [2024-07-13 10:39:44.935422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:08:28.553 [2024-07-13 10:39:44.935464] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:28.553 [2024-07-13 10:39:44.935480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.553 [2024-07-13 10:39:44.935532] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:28.553 [2024-07-13 10:39:44.935550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.553 [2024-07-13 10:39:44.935603] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744022169944063 len:65536 00:08:28.553 [2024-07-13 10:39:44.935616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.812 #75 NEW cov: 11764 ft: 14713 corp: 35/1098b lim: 50 exec/s: 75 rss: 70Mb L: 46/50 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:28.812 [2024-07-13 10:39:44.975302] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069518796287 len:65536 00:08:28.812 [2024-07-13 10:39:44.975330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.812 [2024-07-13 10:39:44.975372] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:28.812 [2024-07-13 10:39:44.975387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.812 #79 NEW cov: 11764 ft: 14744 corp: 36/1127b lim: 50 exec/s: 79 rss: 70Mb L: 29/50 MS: 4 CrossOver-CrossOver-ChangeBinInt-CrossOver- 00:08:28.812 [2024-07-13 10:39:45.015653] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3906369333256140342 len:13879 00:08:28.812 [2024-07-13 10:39:45.015680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.812 [2024-07-13 10:39:45.015716] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3906369333256140342 len:13879 00:08:28.812 [2024-07-13 10:39:45.015746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.812 [2024-07-13 10:39:45.015799] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3906309860843992630 len:13879 00:08:28.812 [2024-07-13 10:39:45.015814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.812 [2024-07-13 10:39:45.015868] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3906369333256140342 len:13879 00:08:28.812 [2024-07-13 10:39:45.015883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.812 #80 NEW cov: 11764 ft: 14762 corp: 37/1173b lim: 50 exec/s: 80 rss: 70Mb L: 46/50 MS: 
1 CMP- DE: "\000\037"- 00:08:28.812 [2024-07-13 10:39:45.045679] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:64001 00:08:28.812 [2024-07-13 10:39:45.045707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.812 [2024-07-13 10:39:45.045746] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:28.812 [2024-07-13 10:39:45.045762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.812 [2024-07-13 10:39:45.045805] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:28.812 [2024-07-13 10:39:45.045820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.812 [2024-07-13 10:39:45.045889] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:28.812 [2024-07-13 10:39:45.045904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.812 #81 NEW cov: 11764 ft: 14770 corp: 38/1220b lim: 50 exec/s: 81 rss: 70Mb L: 47/50 MS: 1 ChangeBinInt- 00:08:28.812 [2024-07-13 10:39:45.085568] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:58624 00:08:28.812 [2024-07-13 10:39:45.085595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.812 [2024-07-13 10:39:45.085633] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:1 00:08:28.812 [2024-07-13 10:39:45.085649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.812 #82 NEW cov: 11764 ft: 14814 corp: 39/1243b lim: 50 exec/s: 82 rss: 70Mb L: 23/50 MS: 1 InsertByte- 00:08:28.812 [2024-07-13 10:39:45.125746] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2821264541661734695 len:10024 00:08:28.812 [2024-07-13 10:39:45.125775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.812 [2024-07-13 10:39:45.125826] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2882303757879027495 len:65536 00:08:28.812 [2024-07-13 10:39:45.125841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.812 #83 NEW cov: 11764 ft: 14839 corp: 40/1266b lim: 50 exec/s: 83 rss: 70Mb L: 23/50 MS: 1 CopyPart- 00:08:28.812 [2024-07-13 10:39:45.166078] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:28.812 [2024-07-13 10:39:45.166106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.812 [2024-07-13 10:39:45.166130] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 
cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:28.812 [2024-07-13 10:39:45.166142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.812 [2024-07-13 10:39:45.166192] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:28.812 [2024-07-13 10:39:45.166207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.812 [2024-07-13 10:39:45.166259] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:28.812 [2024-07-13 10:39:45.166274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.813 #85 NEW cov: 11764 ft: 14848 corp: 41/1312b lim: 50 exec/s: 85 rss: 70Mb L: 46/50 MS: 2 PersAutoDict-InsertRepeatedBytes- DE: "\377\377\377\377\377\377\377\377"- 00:08:28.813 [2024-07-13 10:39:45.195923] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2821266740684990247 len:10034 00:08:28.813 [2024-07-13 10:39:45.195952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.813 [2024-07-13 10:39:45.196007] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744070071517183 len:65536 00:08:28.813 [2024-07-13 10:39:45.196024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.071 #86 NEW cov: 11764 ft: 14861 corp: 42/1339b lim: 50 exec/s: 86 rss: 70Mb L: 27/50 MS: 1 ChangeByte- 00:08:29.071 [2024-07-13 10:39:45.236046] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2821264541661734695 len:10240 00:08:29.071 [2024-07-13 10:39:45.236074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.071 [2024-07-13 10:39:45.236133] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073695395839 len:65295 00:08:29.071 [2024-07-13 10:39:45.236149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.071 #87 NEW cov: 11764 ft: 14929 corp: 43/1359b lim: 50 exec/s: 87 rss: 70Mb L: 20/50 MS: 1 CrossOver- 00:08:29.071 [2024-07-13 10:39:45.276023] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18374673285532090367 len:65536 00:08:29.071 [2024-07-13 10:39:45.276050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.072 #88 NEW cov: 11764 ft: 14940 corp: 44/1374b lim: 50 exec/s: 88 rss: 70Mb L: 15/50 MS: 1 ChangeBit- 00:08:29.072 [2024-07-13 10:39:45.316459] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:29.072 [2024-07-13 10:39:45.316485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.072 [2024-07-13 
10:39:45.316534] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:29.072 [2024-07-13 10:39:45.316551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.072 [2024-07-13 10:39:45.316602] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:268435456 len:1 00:08:29.072 [2024-07-13 10:39:45.316634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.072 [2024-07-13 10:39:45.316685] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:29.072 [2024-07-13 10:39:45.316701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.072 #89 NEW cov: 11764 ft: 14950 corp: 45/1421b lim: 50 exec/s: 89 rss: 70Mb L: 47/50 MS: 1 ChangeBit- 00:08:29.072 [2024-07-13 10:39:45.356268] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:29.072 [2024-07-13 10:39:45.356295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.072 #90 NEW cov: 11764 ft: 14955 corp: 46/1440b lim: 50 exec/s: 45 rss: 70Mb L: 19/50 MS: 1 ChangeBinInt- 00:08:29.072 #90 DONE cov: 11764 ft: 14955 corp: 46/1440b lim: 50 exec/s: 45 rss: 70Mb 00:08:29.072 ###### Recommended dictionary. ###### 00:08:29.072 "\377\377\377\377\377\377\377\377" # Uses: 2 00:08:29.072 "\000\037" # Uses: 0 00:08:29.072 ###### End of recommended dictionary. 
###### 00:08:29.072 Done 90 runs in 2 second(s) 00:08:29.331 10:39:45 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_19.conf 00:08:29.331 10:39:45 -- ../common.sh@72 -- # (( i++ )) 00:08:29.331 10:39:45 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:29.331 10:39:45 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:08:29.331 10:39:45 -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:08:29.331 10:39:45 -- nvmf/run.sh@24 -- # local timen=1 00:08:29.331 10:39:45 -- nvmf/run.sh@25 -- # local core=0x1 00:08:29.331 10:39:45 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:29.331 10:39:45 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:08:29.331 10:39:45 -- nvmf/run.sh@29 -- # printf %02d 20 00:08:29.331 10:39:45 -- nvmf/run.sh@29 -- # port=4420 00:08:29.331 10:39:45 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:29.331 10:39:45 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:08:29.331 10:39:45 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:29.331 10:39:45 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 -r /var/tmp/spdk20.sock 00:08:29.331 [2024-07-13 10:39:45.522141] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:08:29.331 [2024-07-13 10:39:45.522210] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1998261 ] 00:08:29.331 EAL: No free 2048 kB hugepages reported on node 1 00:08:29.331 [2024-07-13 10:39:45.700465] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:29.590 [2024-07-13 10:39:45.720090] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:29.590 [2024-07-13 10:39:45.720215] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:29.590 [2024-07-13 10:39:45.771889] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:29.590 [2024-07-13 10:39:45.788177] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:08:29.590 INFO: Running with entropic power schedule (0xFF, 100). 00:08:29.590 INFO: Seed: 571621792 00:08:29.590 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:08:29.590 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:08:29.590 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:29.590 INFO: A corpus is not provided, starting from an empty corpus 00:08:29.590 #2 INITED exec/s: 0 rss: 60Mb 00:08:29.590 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
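[Editor's sketch] The "Recommended dictionary" block that closed run 19 above is printed by libFuzzer so that useful comparison operands can be fed back into later libFuzzer-driven runs via -dict= (whether the SPDK wrapper forwards that flag is not shown in this log). A minimal sketch of persisting it, assuming a hypothetical /tmp path; the octal escapes printed in the log ("\377", "\000\037") are converted to the \xNN hex form the dictionary file format expects:

    # Hypothetical location; libFuzzer dictionary files take name="value"
    # lines with \xNN hex escapes, and '#' starts a comment.
    cat > /tmp/llvm_nvmf_19.dict <<'EOF'
    # From run 19's recommended dictionary ("Uses: 2" and "Uses: 0"):
    all_ff="\xFF\xFF\xFF\xFF\xFF\xFF\xFF\xFF"
    zero_1f="\x00\x1F"
    EOF

The long llvm_nvme_fuzz command that nvmf/run.sh assembles for run 20 is traced above in one piece. Below is a commented restatement for local reproduction, a sketch only: the workspace path is shortened to $SPDK for readability, and the flag meanings are inferred from the run.sh variables visible in this log (fuzzer_type=20 becomes -Z 20, timen=1 becomes -t 1, core=0x1 becomes -m 0x1, corpus_dir becomes -D, nvmf_cfg becomes -c):

    SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    args=(
      -m 0x1                             # reactor core mask (run.sh: core=0x1)
      -s 512                             # hugepage memory size in MB for the SPDK app
      -P "$SPDK/../output/llvm/"         # output directory forwarded by run.sh
      -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420'
      -c /tmp/fuzz_json_20.conf          # per-run NVMe-oF target config (run.sh: nvmf_cfg)
      -t 1                               # fuzz time in seconds (run.sh: timen=1)
      -D "$SPDK/../corpus/llvm_nvmf_20"  # seed corpus directory (run.sh: corpus_dir)
      -Z 20                              # fuzzer type to exercise (run.sh: fuzzer_type=20)
      -r /var/tmp/spdk20.sock            # RPC socket unique to this instance
    )
    "$SPDK/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" "${args[@]}"

In the run 20 output that follows, each "#N NEW" line is libFuzzer's standard status record: roughly, cov is covered code edges, ft is features, corp is corpus entries and total size, lim is the current input-length limit, exec/s is executions per second, L: is the new input's length against the largest in the corpus, and MS: is the mutation sequence that produced it.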
00:08:29.590 This may also happen if the target rejected all inputs we tried so far 00:08:29.590 [2024-07-13 10:39:45.854871] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.590 [2024-07-13 10:39:45.854907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.590 [2024-07-13 10:39:45.855045] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.590 [2024-07-13 10:39:45.855070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.590 [2024-07-13 10:39:45.855199] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.590 [2024-07-13 10:39:45.855222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.590 [2024-07-13 10:39:45.855342] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:29.590 [2024-07-13 10:39:45.855363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.848 NEW_FUNC[1/672]: 0x4c0800 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:29.848 NEW_FUNC[2/672]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:29.848 #17 NEW cov: 11595 ft: 11596 corp: 2/85b lim: 90 exec/s: 0 rss: 68Mb L: 84/84 MS: 5 ShuffleBytes-CrossOver-ChangeBinInt-CopyPart-InsertRepeatedBytes- 00:08:29.848 [2024-07-13 10:39:46.195685] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:29.848 [2024-07-13 10:39:46.195740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.848 [2024-07-13 10:39:46.195884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:29.848 [2024-07-13 10:39:46.195910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.848 [2024-07-13 10:39:46.196050] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:29.848 [2024-07-13 10:39:46.196078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.848 [2024-07-13 10:39:46.196208] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:29.848 [2024-07-13 10:39:46.196233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.848 #23 NEW cov: 11708 ft: 12133 corp: 3/169b lim: 90 exec/s: 0 rss: 68Mb L: 84/84 MS: 1 ShuffleBytes- 00:08:30.107 [2024-07-13 10:39:46.245648] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.107 [2024-07-13 10:39:46.245678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 
cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.107 [2024-07-13 10:39:46.245800] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.107 [2024-07-13 10:39:46.245821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.107 [2024-07-13 10:39:46.245933] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.107 [2024-07-13 10:39:46.245954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.107 [2024-07-13 10:39:46.246074] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.107 [2024-07-13 10:39:46.246098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.107 #24 NEW cov: 11714 ft: 12377 corp: 4/253b lim: 90 exec/s: 0 rss: 68Mb L: 84/84 MS: 1 ChangeBit- 00:08:30.107 [2024-07-13 10:39:46.285660] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.107 [2024-07-13 10:39:46.285694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.107 [2024-07-13 10:39:46.285803] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.107 [2024-07-13 10:39:46.285825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.107 [2024-07-13 10:39:46.285940] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.107 [2024-07-13 10:39:46.285966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.107 [2024-07-13 10:39:46.286089] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.107 [2024-07-13 10:39:46.286108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.107 #25 NEW cov: 11799 ft: 12644 corp: 5/334b lim: 90 exec/s: 0 rss: 68Mb L: 81/84 MS: 1 CrossOver- 00:08:30.107 [2024-07-13 10:39:46.335834] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.107 [2024-07-13 10:39:46.335863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.107 [2024-07-13 10:39:46.335985] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.107 [2024-07-13 10:39:46.336011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.107 [2024-07-13 10:39:46.336129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.107 [2024-07-13 10:39:46.336148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.107 [2024-07-13 10:39:46.336274] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.107 [2024-07-13 10:39:46.336298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.107 #26 NEW cov: 11799 ft: 12811 corp: 6/418b lim: 90 exec/s: 0 rss: 68Mb L: 84/84 MS: 1 ChangeBit- 00:08:30.107 [2024-07-13 10:39:46.376029] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.107 [2024-07-13 10:39:46.376061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.107 [2024-07-13 10:39:46.376196] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.107 [2024-07-13 10:39:46.376218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.107 [2024-07-13 10:39:46.376334] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.107 [2024-07-13 10:39:46.376360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.107 [2024-07-13 10:39:46.376486] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.107 [2024-07-13 10:39:46.376508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.107 #27 NEW cov: 11799 ft: 12899 corp: 7/502b lim: 90 exec/s: 0 rss: 68Mb L: 84/84 MS: 1 ChangeByte- 00:08:30.107 [2024-07-13 10:39:46.426193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.107 [2024-07-13 10:39:46.426227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.107 [2024-07-13 10:39:46.426328] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.107 [2024-07-13 10:39:46.426349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.107 [2024-07-13 10:39:46.426471] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.107 [2024-07-13 10:39:46.426508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.107 [2024-07-13 10:39:46.426627] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.107 [2024-07-13 10:39:46.426654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.107 #28 NEW cov: 11799 ft: 12955 corp: 8/586b lim: 90 exec/s: 0 rss: 68Mb L: 84/84 MS: 1 ChangeBit- 00:08:30.107 [2024-07-13 10:39:46.466490] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.107 [2024-07-13 10:39:46.466520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.107 [2024-07-13 10:39:46.466601] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.107 [2024-07-13 10:39:46.466624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.107 [2024-07-13 10:39:46.466743] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.107 [2024-07-13 10:39:46.466765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.107 [2024-07-13 10:39:46.466887] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.107 [2024-07-13 10:39:46.466907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.107 [2024-07-13 10:39:46.467030] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:30.107 [2024-07-13 10:39:46.467052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:30.107 #29 NEW cov: 11799 ft: 13077 corp: 9/676b lim: 90 exec/s: 0 rss: 69Mb L: 90/90 MS: 1 CrossOver- 00:08:30.366 [2024-07-13 10:39:46.506333] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.366 [2024-07-13 10:39:46.506364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.366 [2024-07-13 10:39:46.506472] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.366 [2024-07-13 10:39:46.506498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.366 [2024-07-13 10:39:46.506619] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.366 [2024-07-13 10:39:46.506637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.366 [2024-07-13 10:39:46.506764] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.366 [2024-07-13 10:39:46.506789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.366 #30 NEW cov: 11799 ft: 13100 corp: 10/760b lim: 90 exec/s: 0 rss: 69Mb L: 84/90 MS: 1 ShuffleBytes- 00:08:30.366 [2024-07-13 10:39:46.546446] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.366 [2024-07-13 10:39:46.546481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.366 [2024-07-13 10:39:46.546581] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.366 [2024-07-13 10:39:46.546603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.366 [2024-07-13 10:39:46.546716] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.366 [2024-07-13 10:39:46.546732] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.366 [2024-07-13 10:39:46.546845] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.366 [2024-07-13 10:39:46.546866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.366 #31 NEW cov: 11799 ft: 13126 corp: 11/844b lim: 90 exec/s: 0 rss: 69Mb L: 84/90 MS: 1 ChangeBinInt- 00:08:30.366 [2024-07-13 10:39:46.586656] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.366 [2024-07-13 10:39:46.586685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.366 [2024-07-13 10:39:46.586791] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.367 [2024-07-13 10:39:46.586810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.367 [2024-07-13 10:39:46.586923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.367 [2024-07-13 10:39:46.586945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.367 [2024-07-13 10:39:46.587057] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.367 [2024-07-13 10:39:46.587079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.367 #32 NEW cov: 11799 ft: 13137 corp: 12/928b lim: 90 exec/s: 0 rss: 69Mb L: 84/90 MS: 1 ChangeBinInt- 00:08:30.367 [2024-07-13 10:39:46.626771] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.367 [2024-07-13 10:39:46.626801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.367 [2024-07-13 10:39:46.626898] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.367 [2024-07-13 10:39:46.626918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.367 [2024-07-13 10:39:46.627038] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.367 [2024-07-13 10:39:46.627059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.367 [2024-07-13 10:39:46.627173] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.367 [2024-07-13 10:39:46.627196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.367 #33 NEW cov: 11799 ft: 13159 corp: 13/1012b lim: 90 exec/s: 0 rss: 69Mb L: 84/90 MS: 1 CopyPart- 00:08:30.367 [2024-07-13 10:39:46.666879] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.367 [2024-07-13 10:39:46.666910] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.367 [2024-07-13 10:39:46.667007] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.367 [2024-07-13 10:39:46.667028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.367 [2024-07-13 10:39:46.667148] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.367 [2024-07-13 10:39:46.667170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.367 [2024-07-13 10:39:46.667289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.367 [2024-07-13 10:39:46.667311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.367 #34 NEW cov: 11799 ft: 13179 corp: 14/1101b lim: 90 exec/s: 0 rss: 69Mb L: 89/90 MS: 1 CopyPart- 00:08:30.367 [2024-07-13 10:39:46.706962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.367 [2024-07-13 10:39:46.706995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.367 [2024-07-13 10:39:46.707102] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.367 [2024-07-13 10:39:46.707121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.367 [2024-07-13 10:39:46.707242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.367 [2024-07-13 10:39:46.707266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.367 [2024-07-13 10:39:46.707387] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.367 [2024-07-13 10:39:46.707410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.367 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:30.367 #35 NEW cov: 11822 ft: 13261 corp: 15/1185b lim: 90 exec/s: 0 rss: 69Mb L: 84/90 MS: 1 ShuffleBytes- 00:08:30.367 [2024-07-13 10:39:46.747127] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.367 [2024-07-13 10:39:46.747159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.367 [2024-07-13 10:39:46.747250] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.367 [2024-07-13 10:39:46.747273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.367 [2024-07-13 10:39:46.747386] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.367 [2024-07-13 10:39:46.747402] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.367 [2024-07-13 10:39:46.747527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.367 [2024-07-13 10:39:46.747550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.673 #36 NEW cov: 11822 ft: 13279 corp: 16/1266b lim: 90 exec/s: 0 rss: 69Mb L: 81/90 MS: 1 ShuffleBytes- 00:08:30.673 [2024-07-13 10:39:46.787201] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.673 [2024-07-13 10:39:46.787233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.673 [2024-07-13 10:39:46.787335] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.673 [2024-07-13 10:39:46.787355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.673 [2024-07-13 10:39:46.787472] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.673 [2024-07-13 10:39:46.787502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.673 [2024-07-13 10:39:46.787614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.673 [2024-07-13 10:39:46.787633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.673 #37 NEW cov: 11822 ft: 13324 corp: 17/1347b lim: 90 exec/s: 0 rss: 69Mb L: 81/90 MS: 1 CopyPart- 00:08:30.673 [2024-07-13 10:39:46.837331] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.673 [2024-07-13 10:39:46.837361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.673 [2024-07-13 10:39:46.837483] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.673 [2024-07-13 10:39:46.837504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.673 [2024-07-13 10:39:46.837614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.673 [2024-07-13 10:39:46.837643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.673 [2024-07-13 10:39:46.837767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.673 [2024-07-13 10:39:46.837790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.673 #38 NEW cov: 11822 ft: 13333 corp: 18/1431b lim: 90 exec/s: 38 rss: 70Mb L: 84/90 MS: 1 CopyPart- 00:08:30.673 [2024-07-13 10:39:46.877465] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.673 [2024-07-13 10:39:46.877498] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.673 [2024-07-13 10:39:46.877599] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.673 [2024-07-13 10:39:46.877622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.673 [2024-07-13 10:39:46.877748] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.673 [2024-07-13 10:39:46.877769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.673 [2024-07-13 10:39:46.877885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.673 [2024-07-13 10:39:46.877911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.673 #39 NEW cov: 11822 ft: 13366 corp: 19/1517b lim: 90 exec/s: 39 rss: 70Mb L: 86/90 MS: 1 CopyPart- 00:08:30.673 [2024-07-13 10:39:46.917518] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.673 [2024-07-13 10:39:46.917551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.673 [2024-07-13 10:39:46.917636] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.673 [2024-07-13 10:39:46.917661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.673 [2024-07-13 10:39:46.917797] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.673 [2024-07-13 10:39:46.917821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.673 [2024-07-13 10:39:46.917943] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.673 [2024-07-13 10:39:46.917962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.673 #40 NEW cov: 11822 ft: 13376 corp: 20/1598b lim: 90 exec/s: 40 rss: 70Mb L: 81/90 MS: 1 CrossOver- 00:08:30.673 [2024-07-13 10:39:46.957769] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.673 [2024-07-13 10:39:46.957802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.673 [2024-07-13 10:39:46.957933] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.674 [2024-07-13 10:39:46.957955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.674 [2024-07-13 10:39:46.958077] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.674 [2024-07-13 10:39:46.958100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 
p:0 m:0 dnr:1 00:08:30.674 [2024-07-13 10:39:46.958226] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.674 [2024-07-13 10:39:46.958247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.674 #41 NEW cov: 11822 ft: 13410 corp: 21/1683b lim: 90 exec/s: 41 rss: 70Mb L: 85/90 MS: 1 InsertByte- 00:08:30.674 [2024-07-13 10:39:46.997343] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.674 [2024-07-13 10:39:46.997370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.674 [2024-07-13 10:39:46.997491] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.674 [2024-07-13 10:39:46.997515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.674 #42 NEW cov: 11822 ft: 13824 corp: 22/1731b lim: 90 exec/s: 42 rss: 70Mb L: 48/90 MS: 1 EraseBytes- 00:08:30.969 [2024-07-13 10:39:47.048339] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.969 [2024-07-13 10:39:47.048370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.969 [2024-07-13 10:39:47.048506] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.969 [2024-07-13 10:39:47.048530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.969 [2024-07-13 10:39:47.048651] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.969 [2024-07-13 10:39:47.048674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.969 [2024-07-13 10:39:47.048795] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.969 [2024-07-13 10:39:47.048820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.969 [2024-07-13 10:39:47.048944] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:30.969 [2024-07-13 10:39:47.048966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:30.969 #43 NEW cov: 11822 ft: 13836 corp: 23/1821b lim: 90 exec/s: 43 rss: 70Mb L: 90/90 MS: 1 ChangeBinInt- 00:08:30.969 [2024-07-13 10:39:47.098186] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.969 [2024-07-13 10:39:47.098224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.969 [2024-07-13 10:39:47.098350] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.969 [2024-07-13 10:39:47.098370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 
m:0 dnr:1 00:08:30.969 [2024-07-13 10:39:47.098490] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.969 [2024-07-13 10:39:47.098509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.969 [2024-07-13 10:39:47.098641] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.969 [2024-07-13 10:39:47.098666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.969 #44 NEW cov: 11822 ft: 13875 corp: 24/1902b lim: 90 exec/s: 44 rss: 70Mb L: 81/90 MS: 1 CrossOver- 00:08:30.969 [2024-07-13 10:39:47.137817] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.969 [2024-07-13 10:39:47.137841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.969 [2024-07-13 10:39:47.137968] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.969 [2024-07-13 10:39:47.137989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.969 #45 NEW cov: 11822 ft: 13898 corp: 25/1954b lim: 90 exec/s: 45 rss: 70Mb L: 52/90 MS: 1 CMP- DE: "\377\377\377\011"- 00:08:30.969 [2024-07-13 10:39:47.178432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.969 [2024-07-13 10:39:47.178467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.969 [2024-07-13 10:39:47.178559] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.969 [2024-07-13 10:39:47.178579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.969 [2024-07-13 10:39:47.178701] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.969 [2024-07-13 10:39:47.178721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.969 [2024-07-13 10:39:47.178834] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.969 [2024-07-13 10:39:47.178856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.969 #46 NEW cov: 11822 ft: 13922 corp: 26/2038b lim: 90 exec/s: 46 rss: 70Mb L: 84/90 MS: 1 ChangeBinInt- 00:08:30.969 [2024-07-13 10:39:47.218574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.969 [2024-07-13 10:39:47.218605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.969 [2024-07-13 10:39:47.218703] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.969 [2024-07-13 10:39:47.218725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:08:30.969 [2024-07-13 10:39:47.218842] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.969 [2024-07-13 10:39:47.218863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.969 [2024-07-13 10:39:47.218994] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.969 [2024-07-13 10:39:47.219021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.969 #47 NEW cov: 11822 ft: 13926 corp: 27/2122b lim: 90 exec/s: 47 rss: 70Mb L: 84/90 MS: 1 ChangeBinInt- 00:08:30.969 [2024-07-13 10:39:47.258589] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.969 [2024-07-13 10:39:47.258624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.969 [2024-07-13 10:39:47.258737] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.969 [2024-07-13 10:39:47.258763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.969 [2024-07-13 10:39:47.258887] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.969 [2024-07-13 10:39:47.258906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.969 [2024-07-13 10:39:47.259025] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.969 [2024-07-13 10:39:47.259050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.969 #48 NEW cov: 11822 ft: 13951 corp: 28/2206b lim: 90 exec/s: 48 rss: 70Mb L: 84/90 MS: 1 ChangeByte- 00:08:30.969 [2024-07-13 10:39:47.308360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.969 [2024-07-13 10:39:47.308392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.969 [2024-07-13 10:39:47.308523] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.969 [2024-07-13 10:39:47.308548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.969 #49 NEW cov: 11822 ft: 13983 corp: 29/2255b lim: 90 exec/s: 49 rss: 70Mb L: 49/90 MS: 1 InsertByte- 00:08:30.969 [2024-07-13 10:39:47.348813] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:30.969 [2024-07-13 10:39:47.348845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.969 [2024-07-13 10:39:47.348951] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:30.969 [2024-07-13 10:39:47.348977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:08:30.969 [2024-07-13 10:39:47.349100] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:30.969 [2024-07-13 10:39:47.349122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.969 [2024-07-13 10:39:47.349242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:30.969 [2024-07-13 10:39:47.349268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.229 #50 NEW cov: 11822 ft: 14006 corp: 30/2341b lim: 90 exec/s: 50 rss: 70Mb L: 86/90 MS: 1 ChangeBit- 00:08:31.229 [2024-07-13 10:39:47.398372] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:31.229 [2024-07-13 10:39:47.398407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.229 [2024-07-13 10:39:47.398536] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:31.229 [2024-07-13 10:39:47.398561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.229 [2024-07-13 10:39:47.398687] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:31.229 [2024-07-13 10:39:47.398713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.229 #51 NEW cov: 11822 ft: 14344 corp: 31/2407b lim: 90 exec/s: 51 rss: 70Mb L: 66/90 MS: 1 CrossOver- 00:08:31.229 [2024-07-13 10:39:47.439318] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:31.229 [2024-07-13 10:39:47.439349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.229 [2024-07-13 10:39:47.439458] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:31.229 [2024-07-13 10:39:47.439483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.229 [2024-07-13 10:39:47.439519] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:31.229 [2024-07-13 10:39:47.439543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.229 [2024-07-13 10:39:47.439663] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:31.229 [2024-07-13 10:39:47.439686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.229 [2024-07-13 10:39:47.439802] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:31.229 [2024-07-13 10:39:47.439827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:31.229 #52 NEW cov: 11822 ft: 14353 corp: 32/2497b lim: 90 exec/s: 52 rss: 70Mb L: 90/90 
MS: 1 InsertByte- 00:08:31.229 [2024-07-13 10:39:47.489121] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:31.229 [2024-07-13 10:39:47.489159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.229 [2024-07-13 10:39:47.489283] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:31.229 [2024-07-13 10:39:47.489304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.229 [2024-07-13 10:39:47.489429] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:31.229 [2024-07-13 10:39:47.489461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.229 [2024-07-13 10:39:47.489584] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:31.229 [2024-07-13 10:39:47.489606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.229 #53 NEW cov: 11822 ft: 14366 corp: 33/2583b lim: 90 exec/s: 53 rss: 70Mb L: 86/90 MS: 1 ChangeByte- 00:08:31.229 [2024-07-13 10:39:47.539148] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:31.229 [2024-07-13 10:39:47.539184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.229 [2024-07-13 10:39:47.539302] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:31.229 [2024-07-13 10:39:47.539324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.229 [2024-07-13 10:39:47.539435] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:31.229 [2024-07-13 10:39:47.539466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.229 #54 NEW cov: 11822 ft: 14371 corp: 34/2651b lim: 90 exec/s: 54 rss: 70Mb L: 68/90 MS: 1 EraseBytes- 00:08:31.229 [2024-07-13 10:39:47.579031] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:31.229 [2024-07-13 10:39:47.579061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.229 [2024-07-13 10:39:47.579171] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:31.229 [2024-07-13 10:39:47.579191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.229 #55 NEW cov: 11822 ft: 14384 corp: 35/2703b lim: 90 exec/s: 55 rss: 70Mb L: 52/90 MS: 1 EraseBytes- 00:08:31.489 [2024-07-13 10:39:47.629963] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:31.489 [2024-07-13 10:39:47.629997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 
p:0 m:0 dnr:1 00:08:31.489 [2024-07-13 10:39:47.630097] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:31.489 [2024-07-13 10:39:47.630117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.489 [2024-07-13 10:39:47.630230] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:31.489 [2024-07-13 10:39:47.630254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.489 [2024-07-13 10:39:47.630363] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:31.489 [2024-07-13 10:39:47.630387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.489 [2024-07-13 10:39:47.630514] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:31.489 [2024-07-13 10:39:47.630535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:31.489 #56 NEW cov: 11822 ft: 14421 corp: 36/2793b lim: 90 exec/s: 56 rss: 70Mb L: 90/90 MS: 1 ChangeByte- 00:08:31.489 [2024-07-13 10:39:47.679834] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:31.489 [2024-07-13 10:39:47.679868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.489 [2024-07-13 10:39:47.679979] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:31.489 [2024-07-13 10:39:47.680003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.489 [2024-07-13 10:39:47.680119] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:31.489 [2024-07-13 10:39:47.680140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.489 [2024-07-13 10:39:47.680256] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:31.489 [2024-07-13 10:39:47.680281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.489 #57 NEW cov: 11822 ft: 14427 corp: 37/2877b lim: 90 exec/s: 57 rss: 70Mb L: 84/90 MS: 1 ShuffleBytes- 00:08:31.489 [2024-07-13 10:39:47.719937] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:31.489 [2024-07-13 10:39:47.719967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.489 [2024-07-13 10:39:47.720093] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:31.489 [2024-07-13 10:39:47.720116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.489 [2024-07-13 10:39:47.720237] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:31.489 [2024-07-13 10:39:47.720258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.489 [2024-07-13 10:39:47.720376] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:31.489 [2024-07-13 10:39:47.720395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.489 #63 NEW cov: 11822 ft: 14457 corp: 38/2961b lim: 90 exec/s: 63 rss: 70Mb L: 84/90 MS: 1 ChangeBinInt- 00:08:31.489 [2024-07-13 10:39:47.759577] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:31.489 [2024-07-13 10:39:47.759612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.489 [2024-07-13 10:39:47.759736] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:31.489 [2024-07-13 10:39:47.759756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.489 #64 NEW cov: 11822 ft: 14471 corp: 39/3009b lim: 90 exec/s: 64 rss: 70Mb L: 48/90 MS: 1 ChangeBinInt- 00:08:31.489 [2024-07-13 10:39:47.800157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:31.489 [2024-07-13 10:39:47.800188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.489 [2024-07-13 10:39:47.800284] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:31.489 [2024-07-13 10:39:47.800321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.489 [2024-07-13 10:39:47.800438] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:31.489 [2024-07-13 10:39:47.800465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.489 [2024-07-13 10:39:47.800599] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:31.489 [2024-07-13 10:39:47.800620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.489 #65 NEW cov: 11822 ft: 14475 corp: 40/3093b lim: 90 exec/s: 65 rss: 70Mb L: 84/90 MS: 1 ShuffleBytes- 00:08:31.489 [2024-07-13 10:39:47.840257] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:31.490 [2024-07-13 10:39:47.840290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.490 [2024-07-13 10:39:47.840399] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:31.490 [2024-07-13 10:39:47.840420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.490 [2024-07-13 10:39:47.840554] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:31.490 [2024-07-13 10:39:47.840577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.490 [2024-07-13 10:39:47.840701] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:31.490 [2024-07-13 10:39:47.840719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.490 #66 NEW cov: 11822 ft: 14488 corp: 41/3177b lim: 90 exec/s: 33 rss: 70Mb L: 84/90 MS: 1 ChangeBit- 00:08:31.490 #66 DONE cov: 11822 ft: 14488 corp: 41/3177b lim: 90 exec/s: 33 rss: 70Mb 00:08:31.490 ###### Recommended dictionary. ###### 00:08:31.490 "\377\377\377\011" # Uses: 1 00:08:31.490 ###### End of recommended dictionary. ###### 00:08:31.490 Done 66 runs in 2 second(s) 00:08:31.750 10:39:47 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_20.conf 00:08:31.750 10:39:47 -- ../common.sh@72 -- # (( i++ )) 00:08:31.750 10:39:47 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:31.750 10:39:47 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:31.750 10:39:47 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:31.750 10:39:47 -- nvmf/run.sh@24 -- # local timen=1 00:08:31.750 10:39:47 -- nvmf/run.sh@25 -- # local core=0x1 00:08:31.750 10:39:47 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:31.750 10:39:47 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:31.750 10:39:47 -- nvmf/run.sh@29 -- # printf %02d 21 00:08:31.750 10:39:47 -- nvmf/run.sh@29 -- # port=4421 00:08:31.750 10:39:47 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:31.750 10:39:47 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:31.750 10:39:47 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:31.750 10:39:47 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 -r /var/tmp/spdk21.sock 00:08:31.750 [2024-07-13 10:39:48.005669] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
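The shell trace above (the `10:39:47 -- nvmf/run.sh@NN` lines) records how the harness prepares each fuzzer instance: fuzzer N gets TCP port 44NN, the stock fuzz_json.conf (which targets the default trsvcid 4420) is rewritten with sed into a per-fuzzer /tmp/fuzz_json_NN.conf, a corpus directory is created, and llvm_nvme_fuzz is launched against the rewritten transport ID. Below is a minimal standalone sketch of that setup step, reconstructed from the trace; the fuzzer number, sed expression, flags, and paths are taken from the log, while the WORKSPACE variable and the redirection into the conf file are assumptions (bash xtrace does not show redirections):

    #!/usr/bin/env bash
    # Sketch of the per-fuzzer setup visible in the nvmf/run.sh trace above.
    WORKSPACE=/var/jenkins/workspace/short-fuzz-phy-autotest   # assumption: inferred from the paths in the log
    fuzzer_type=21
    port=44$(printf %02d "$fuzzer_type")                       # matches 'printf %02d 21' -> port=4421
    nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
    corpus_dir=$WORKSPACE/spdk/../corpus/llvm_nvmf_$(printf %02d "$fuzzer_type")
    mkdir -p "$corpus_dir"
    # Retarget the TCP listener from the default 4420 to this fuzzer's private port.
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$WORKSPACE/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    "$WORKSPACE/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
        -m 0x1 -s 512 -P "$WORKSPACE/spdk/../output/llvm/" \
        -F "$trid" -c "$nvmf_cfg" -t 1 -D "$corpus_dir" \
        -Z "$fuzzer_type" -r "/var/tmp/spdk${fuzzer_type}.sock"

The -Z value selects the fuzz target: the NEW_FUNC lines in the run that follows show -Z 21 bound to fuzz_nvm_reservation_release_command, which is why its NOTICE output prints RESERVATION RELEASE (15) where run 20 printed RESERVATION ACQUIRE (11). The -t value comes from timen=1 in the trace.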
00:08:31.750 [2024-07-13 10:39:48.005730] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1998798 ] 00:08:31.750 EAL: No free 2048 kB hugepages reported on node 1 00:08:32.008 [2024-07-13 10:39:48.179900] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:32.008 [2024-07-13 10:39:48.199593] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:32.008 [2024-07-13 10:39:48.199715] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:32.008 [2024-07-13 10:39:48.251113] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:32.008 [2024-07-13 10:39:48.267375] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:32.008 INFO: Running with entropic power schedule (0xFF, 100). 00:08:32.008 INFO: Seed: 3050627207 00:08:32.008 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:08:32.008 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:08:32.008 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:32.008 INFO: A corpus is not provided, starting from an empty corpus 00:08:32.008 #2 INITED exec/s: 0 rss: 60Mb 00:08:32.008 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:32.008 This may also happen if the target rejected all inputs we tried so far 00:08:32.008 [2024-07-13 10:39:48.343416] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:32.009 [2024-07-13 10:39:48.343463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.268 NEW_FUNC[1/672]: 0x4c3a20 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:32.268 NEW_FUNC[2/672]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:32.268 #21 NEW cov: 11570 ft: 11571 corp: 2/16b lim: 50 exec/s: 0 rss: 68Mb L: 15/15 MS: 4 ShuffleBytes-ChangeBit-ChangeByte-InsertRepeatedBytes- 00:08:32.527 [2024-07-13 10:39:48.674113] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:32.527 [2024-07-13 10:39:48.674169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.527 #28 NEW cov: 11683 ft: 12101 corp: 3/26b lim: 50 exec/s: 0 rss: 68Mb L: 10/15 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:32.527 [2024-07-13 10:39:48.714191] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:32.527 [2024-07-13 10:39:48.714221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.527 #29 NEW cov: 11689 ft: 12415 corp: 4/36b lim: 50 exec/s: 0 rss: 68Mb L: 10/15 MS: 1 ShuffleBytes- 00:08:32.527 [2024-07-13 10:39:48.754533] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:32.527 [2024-07-13 10:39:48.754564] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.527 [2024-07-13 10:39:48.754692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:32.527 [2024-07-13 10:39:48.754715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.527 #30 NEW cov: 11774 ft: 13471 corp: 5/60b lim: 50 exec/s: 0 rss: 68Mb L: 24/24 MS: 1 CrossOver- 00:08:32.527 [2024-07-13 10:39:48.804366] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:32.527 [2024-07-13 10:39:48.804402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.527 #38 NEW cov: 11774 ft: 13567 corp: 6/72b lim: 50 exec/s: 0 rss: 68Mb L: 12/24 MS: 3 CopyPart-CMP-CopyPart- DE: "\377\377~-l\013i\025"- 00:08:32.527 [2024-07-13 10:39:48.845195] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:32.527 [2024-07-13 10:39:48.845231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.527 [2024-07-13 10:39:48.845343] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:32.527 [2024-07-13 10:39:48.845366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.527 [2024-07-13 10:39:48.845502] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:32.527 [2024-07-13 10:39:48.845521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.527 [2024-07-13 10:39:48.845650] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:32.527 [2024-07-13 10:39:48.845675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.527 #39 NEW cov: 11774 ft: 14033 corp: 7/119b lim: 50 exec/s: 0 rss: 69Mb L: 47/47 MS: 1 InsertRepeatedBytes- 00:08:32.527 [2024-07-13 10:39:48.894735] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:32.527 [2024-07-13 10:39:48.894762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.786 #40 NEW cov: 11774 ft: 14082 corp: 8/131b lim: 50 exec/s: 0 rss: 69Mb L: 12/47 MS: 1 CopyPart- 00:08:32.786 [2024-07-13 10:39:48.945573] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:32.786 [2024-07-13 10:39:48.945607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.786 [2024-07-13 10:39:48.945684] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:32.786 [2024-07-13 10:39:48.945708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.786 [2024-07-13 10:39:48.945831] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:32.786 [2024-07-13 10:39:48.945854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.786 [2024-07-13 10:39:48.945968] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:32.786 [2024-07-13 10:39:48.945991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.786 #41 NEW cov: 11774 ft: 14152 corp: 9/178b lim: 50 exec/s: 0 rss: 69Mb L: 47/47 MS: 1 ChangeBit- 00:08:32.786 [2024-07-13 10:39:48.994965] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:32.786 [2024-07-13 10:39:48.994992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.786 #47 NEW cov: 11774 ft: 14196 corp: 10/188b lim: 50 exec/s: 0 rss: 69Mb L: 10/47 MS: 1 EraseBytes- 00:08:32.786 [2024-07-13 10:39:49.035224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:32.786 [2024-07-13 10:39:49.035253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.786 #48 NEW cov: 11774 ft: 14289 corp: 11/207b lim: 50 exec/s: 0 rss: 69Mb L: 19/47 MS: 1 CopyPart- 00:08:32.786 [2024-07-13 10:39:49.075372] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:32.786 [2024-07-13 10:39:49.075398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.786 #49 NEW cov: 11774 ft: 14315 corp: 12/222b lim: 50 exec/s: 0 rss: 69Mb L: 15/47 MS: 1 ChangeBit- 00:08:32.786 [2024-07-13 10:39:49.115593] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:32.786 [2024-07-13 10:39:49.115626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.786 [2024-07-13 10:39:49.115746] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:32.786 [2024-07-13 10:39:49.115771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.786 #50 NEW cov: 11774 ft: 14339 corp: 13/244b lim: 50 exec/s: 0 rss: 69Mb L: 22/47 MS: 1 EraseBytes- 00:08:32.786 [2024-07-13 10:39:49.155491] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:32.786 [2024-07-13 10:39:49.155520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.046 #51 NEW cov: 11774 ft: 14383 corp: 14/254b lim: 50 exec/s: 0 rss: 69Mb L: 10/47 MS: 1 ChangeBinInt- 00:08:33.046 [2024-07-13 10:39:49.196155] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:33.046 [2024-07-13 10:39:49.196187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.046 [2024-07-13 10:39:49.196288] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:33.046 [2024-07-13 10:39:49.196311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.046 [2024-07-13 10:39:49.196425] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:33.046 [2024-07-13 10:39:49.196448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.046 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:33.046 #53 NEW cov: 11797 ft: 14657 corp: 15/285b lim: 50 exec/s: 0 rss: 69Mb L: 31/47 MS: 2 CrossOver-InsertRepeatedBytes- 00:08:33.046 [2024-07-13 10:39:49.236041] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:33.046 [2024-07-13 10:39:49.236071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.046 [2024-07-13 10:39:49.236197] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:33.046 [2024-07-13 10:39:49.236221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.046 #54 NEW cov: 11797 ft: 14687 corp: 16/309b lim: 50 exec/s: 0 rss: 69Mb L: 24/47 MS: 1 CopyPart- 00:08:33.046 [2024-07-13 10:39:49.276140] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:33.046 [2024-07-13 10:39:49.276166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.046 [2024-07-13 10:39:49.276298] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:33.046 [2024-07-13 10:39:49.276321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.046 #60 NEW cov: 11797 ft: 14738 corp: 17/333b lim: 50 exec/s: 0 rss: 70Mb L: 24/47 MS: 1 CopyPart- 00:08:33.046 [2024-07-13 10:39:49.326792] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:33.046 [2024-07-13 10:39:49.326828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.046 [2024-07-13 10:39:49.326924] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:33.046 [2024-07-13 10:39:49.326944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.046 [2024-07-13 10:39:49.327066] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:33.046 [2024-07-13 10:39:49.327087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.046 [2024-07-13 10:39:49.327208] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:33.046 [2024-07-13 10:39:49.327232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.046 #61 NEW cov: 11797 ft: 14750 corp: 18/381b lim: 50 exec/s: 61 rss: 70Mb L: 48/48 MS: 1 InsertByte- 00:08:33.046 [2024-07-13 10:39:49.376455] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:33.046 [2024-07-13 10:39:49.376485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.046 [2024-07-13 10:39:49.376556] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:33.046 [2024-07-13 10:39:49.376574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.046 #62 NEW cov: 11797 ft: 14822 corp: 19/407b lim: 50 exec/s: 62 rss: 70Mb L: 26/48 MS: 1 InsertRepeatedBytes- 00:08:33.046 [2024-07-13 10:39:49.416317] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:33.046 [2024-07-13 10:39:49.416344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.305 #63 NEW cov: 11797 ft: 14841 corp: 20/417b lim: 50 exec/s: 63 rss: 70Mb L: 10/48 MS: 1 EraseBytes- 00:08:33.305 [2024-07-13 10:39:49.456385] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:33.305 [2024-07-13 10:39:49.456411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.305 #64 NEW cov: 11797 ft: 14868 corp: 21/436b lim: 50 exec/s: 64 rss: 70Mb L: 19/48 MS: 1 ShuffleBytes- 00:08:33.305 [2024-07-13 10:39:49.496584] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:33.305 [2024-07-13 10:39:49.496611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.305 #65 NEW cov: 11797 ft: 14895 corp: 22/451b lim: 50 exec/s: 65 rss: 70Mb L: 15/48 MS: 1 ChangeBinInt- 00:08:33.305 [2024-07-13 10:39:49.536711] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:33.305 [2024-07-13 10:39:49.536737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.305 #66 NEW cov: 11797 ft: 14904 corp: 23/466b lim: 50 exec/s: 66 rss: 70Mb L: 15/48 MS: 1 ChangeBinInt- 00:08:33.305 [2024-07-13 10:39:49.577141] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:33.305 [2024-07-13 10:39:49.577166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.306 [2024-07-13 10:39:49.577292] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:33.306 [2024-07-13 10:39:49.577311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.306 #67 NEW cov: 11797 ft: 14943 corp: 24/488b lim: 50 exec/s: 67 rss: 70Mb L: 22/48 MS: 1 CrossOver- 00:08:33.306 [2024-07-13 10:39:49.616867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:33.306 [2024-07-13 10:39:49.616893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.306 #68 NEW cov: 11797 ft: 14981 corp: 25/507b lim: 50 exec/s: 68 rss: 70Mb L: 19/48 MS: 1 PersAutoDict- DE: "\377\377~-l\013i\025"- 00:08:33.306 [2024-07-13 10:39:49.657110] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:33.306 [2024-07-13 10:39:49.657143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.306 #69 NEW cov: 11797 ft: 14995 corp: 26/521b lim: 50 exec/s: 69 rss: 70Mb L: 14/48 MS: 1 EraseBytes- 00:08:33.565 [2024-07-13 10:39:49.697283] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:33.565 [2024-07-13 10:39:49.697322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.565 #70 NEW cov: 11797 ft: 15003 corp: 27/540b lim: 50 exec/s: 70 rss: 70Mb L: 19/48 MS: 1 ChangeBit- 00:08:33.565 [2024-07-13 10:39:49.737273] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:33.565 [2024-07-13 10:39:49.737300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.565 #71 NEW cov: 11797 ft: 15037 corp: 28/555b lim: 50 exec/s: 71 rss: 70Mb L: 15/48 MS: 1 ShuffleBytes- 00:08:33.565 [2024-07-13 10:39:49.777447] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:33.565 [2024-07-13 10:39:49.777479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.565 #72 NEW cov: 11797 ft: 15064 corp: 29/565b lim: 50 exec/s: 72 rss: 70Mb L: 10/48 MS: 1 ChangeBinInt- 00:08:33.565 [2024-07-13 10:39:49.818324] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:33.565 [2024-07-13 10:39:49.818355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.565 [2024-07-13 10:39:49.818456] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:33.565 [2024-07-13 10:39:49.818480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.565 [2024-07-13 10:39:49.818589] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:33.565 [2024-07-13 10:39:49.818609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.565 [2024-07-13 10:39:49.818728] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:33.565 [2024-07-13 10:39:49.818750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.565 #73 NEW cov: 11797 ft: 15076 corp: 30/613b lim: 50 exec/s: 73 rss: 70Mb L: 48/48 MS: 1 CrossOver- 00:08:33.565 [2024-07-13 10:39:49.858423] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:33.565 [2024-07-13 10:39:49.858458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.565 [2024-07-13 10:39:49.858564] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:33.565 [2024-07-13 10:39:49.858588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.565 [2024-07-13 10:39:49.858712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:33.565 [2024-07-13 10:39:49.858734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.565 [2024-07-13 10:39:49.858859] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:33.565 [2024-07-13 10:39:49.858876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.565 #74 NEW cov: 11797 ft: 15151 corp: 31/660b lim: 50 exec/s: 74 rss: 70Mb L: 47/48 MS: 1 ChangeByte- 00:08:33.565 [2024-07-13 10:39:49.898549] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:33.565 [2024-07-13 10:39:49.898579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.565 [2024-07-13 10:39:49.898685] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:33.565 [2024-07-13 10:39:49.898706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.565 [2024-07-13 10:39:49.898818] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:33.565 [2024-07-13 10:39:49.898838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.565 [2024-07-13 10:39:49.898960] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:33.565 [2024-07-13 10:39:49.898985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.565 #75 NEW cov: 11797 ft: 15165 corp: 32/701b lim: 50 exec/s: 75 rss: 70Mb L: 41/48 MS: 1 InsertRepeatedBytes- 00:08:33.565 [2024-07-13 10:39:49.937993] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:33.565 [2024-07-13 10:39:49.938019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.824 #76 NEW cov: 11797 ft: 15170 corp: 33/712b lim: 50 exec/s: 76 rss: 70Mb L: 11/48 MS: 1 CopyPart- 00:08:33.824 [2024-07-13 10:39:49.978402] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:33.824 [2024-07-13 10:39:49.978427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.824 [2024-07-13 10:39:49.978545] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:33.824 [2024-07-13 10:39:49.978571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.825 #77 NEW cov: 11797 ft: 15183 corp: 34/734b lim: 50 exec/s: 77 rss: 70Mb L: 22/48 MS: 1 EraseBytes- 00:08:33.825 [2024-07-13 10:39:50.018139] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:33.825 [2024-07-13 10:39:50.018166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.825 #78 NEW cov: 11797 ft: 15191 corp: 35/745b lim: 50 exec/s: 78 rss: 70Mb L: 11/48 MS: 1 InsertByte- 00:08:33.825 [2024-07-13 10:39:50.058861] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:33.825 [2024-07-13 10:39:50.058896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.825 [2024-07-13 10:39:50.059018] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:33.825 [2024-07-13 10:39:50.059041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.825 [2024-07-13 10:39:50.059169] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:33.825 [2024-07-13 10:39:50.059192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.825 #79 NEW cov: 11797 ft: 15199 corp: 36/776b lim: 50 exec/s: 79 rss: 70Mb L: 31/48 MS: 1 ChangeBit- 00:08:33.825 [2024-07-13 10:39:50.108491] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:33.825 [2024-07-13 10:39:50.108523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.825 #80 NEW cov: 11797 ft: 15300 corp: 37/795b lim: 50 exec/s: 80 rss: 70Mb L: 19/48 MS: 1 ChangeBit- 00:08:33.825 [2024-07-13 10:39:50.149325] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:33.825 [2024-07-13 10:39:50.149357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.825 [2024-07-13 10:39:50.149446] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:33.825 [2024-07-13 10:39:50.149467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.825 [2024-07-13 10:39:50.149583] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:33.825 [2024-07-13 10:39:50.149604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.825 [2024-07-13 10:39:50.149738] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:33.825 [2024-07-13 10:39:50.149759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.825 #81 NEW cov: 11797 ft: 15312 corp: 38/836b lim: 50 exec/s: 81 rss: 70Mb L: 41/48 MS: 1 EraseBytes- 00:08:33.825 [2024-07-13 10:39:50.188859] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:33.825 [2024-07-13 10:39:50.188895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.825 [2024-07-13 10:39:50.189017] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:33.825 [2024-07-13 10:39:50.189040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.825 #82 NEW cov: 11797 ft: 15333 corp: 39/865b lim: 50 exec/s: 82 rss: 70Mb L: 29/48 MS: 1 CrossOver- 00:08:34.084 [2024-07-13 10:39:50.228839] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:34.084 [2024-07-13 10:39:50.228866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.084 #83 NEW cov: 11797 ft: 15346 corp: 40/884b lim: 50 exec/s: 83 rss: 70Mb L: 19/48 MS: 1 ChangeBinInt- 00:08:34.084 [2024-07-13 10:39:50.269372] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:34.084 [2024-07-13 10:39:50.269405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.084 [2024-07-13 10:39:50.269532] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:34.084 [2024-07-13 10:39:50.269562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.084 [2024-07-13 10:39:50.269692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:34.084 [2024-07-13 10:39:50.269713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.084 #84 NEW cov: 11797 ft: 15364 corp: 41/915b lim: 50 exec/s: 84 rss: 70Mb L: 31/48 MS: 1 EraseBytes- 00:08:34.084 [2024-07-13 10:39:50.309310] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:34.084 [2024-07-13 10:39:50.309342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.084 [2024-07-13 10:39:50.309470] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:34.084 [2024-07-13 10:39:50.309489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.084 #85 NEW cov: 11797 ft: 15369 corp: 42/941b lim: 50 exec/s: 42 rss: 70Mb L: 26/48 MS: 1 ShuffleBytes- 00:08:34.084 #85 DONE cov: 11797 ft: 15369 corp: 42/941b lim: 50 exec/s: 42 rss: 70Mb 00:08:34.084 ###### Recommended dictionary. ###### 00:08:34.084 "\377\377~-l\013i\025" # Uses: 3 00:08:34.084 ###### End of recommended dictionary. 
###### 00:08:34.084 Done 85 runs in 2 second(s) 00:08:34.084 10:39:50 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_21.conf 00:08:34.084 10:39:50 -- ../common.sh@72 -- # (( i++ )) 00:08:34.084 10:39:50 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:34.084 10:39:50 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:08:34.084 10:39:50 -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:08:34.084 10:39:50 -- nvmf/run.sh@24 -- # local timen=1 00:08:34.084 10:39:50 -- nvmf/run.sh@25 -- # local core=0x1 00:08:34.084 10:39:50 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:34.084 10:39:50 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:08:34.084 10:39:50 -- nvmf/run.sh@29 -- # printf %02d 22 00:08:34.084 10:39:50 -- nvmf/run.sh@29 -- # port=4422 00:08:34.084 10:39:50 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:34.084 10:39:50 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:08:34.084 10:39:50 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:34.084 10:39:50 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 -r /var/tmp/spdk22.sock 00:08:34.343 [2024-07-13 10:39:50.473453] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:08:34.343 [2024-07-13 10:39:50.473519] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1999196 ] 00:08:34.343 EAL: No free 2048 kB hugepages reported on node 1 00:08:34.343 [2024-07-13 10:39:50.650753] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:34.343 [2024-07-13 10:39:50.670197] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:34.343 [2024-07-13 10:39:50.670340] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:34.343 [2024-07-13 10:39:50.721783] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:34.602 [2024-07-13 10:39:50.738067] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:08:34.602 INFO: Running with entropic power schedule (0xFF, 100). 00:08:34.602 INFO: Seed: 1227656564 00:08:34.602 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:08:34.602 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:08:34.602 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:34.602 INFO: A corpus is not provided, starting from an empty corpus 00:08:34.602 #2 INITED exec/s: 0 rss: 60Mb 00:08:34.602 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
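Each `#NN NEW cov:` record in the runs above is a standard libFuzzer status line: cov counts covered code edges/blocks, ft counts coverage features, corp gives the corpus size as entries/bytes, exec/s and rss report throughput and memory footprint, and L and MS give the new input's size and the mutation sequence that produced it; the closing `#NN DONE` line reports the totals for the run. When triaging one of these logs after the fact, the coverage progression can be pulled out with a one-liner; build.log is a placeholder for wherever this console output was saved:

    # List every new-coverage event and the final totals:
    # iteration, edge count, feature count, corpus size.
    grep -oE '#[0-9]+ (NEW|DONE) cov: [0-9]+ ft: [0-9]+ corp: [0-9]+/[0-9]+b' build.log

Because grep -o prints each match on its own line, this works even where several status records share one physical line, as they do in the captured output above.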
00:08:34.603 This may also happen if the target rejected all inputs we tried so far 00:08:34.603 [2024-07-13 10:39:50.804356] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.603 [2024-07-13 10:39:50.804391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.603 [2024-07-13 10:39:50.804516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.603 [2024-07-13 10:39:50.804543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.603 [2024-07-13 10:39:50.804661] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:34.603 [2024-07-13 10:39:50.804689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.862 NEW_FUNC[1/672]: 0x4c5ce0 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:34.862 NEW_FUNC[2/672]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:34.862 #4 NEW cov: 11596 ft: 11593 corp: 2/65b lim: 85 exec/s: 0 rss: 68Mb L: 64/64 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:34.862 [2024-07-13 10:39:51.135362] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.862 [2024-07-13 10:39:51.135418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.862 [2024-07-13 10:39:51.135556] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.862 [2024-07-13 10:39:51.135587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.862 [2024-07-13 10:39:51.135724] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:34.862 [2024-07-13 10:39:51.135759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.862 #7 NEW cov: 11709 ft: 12131 corp: 3/126b lim: 85 exec/s: 0 rss: 68Mb L: 61/64 MS: 3 ChangeBit-ChangeByte-InsertRepeatedBytes- 00:08:34.862 [2024-07-13 10:39:51.174962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.862 [2024-07-13 10:39:51.174994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.862 [2024-07-13 10:39:51.175129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.862 [2024-07-13 10:39:51.175151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.862 [2024-07-13 10:39:51.175274] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:34.862 [2024-07-13 10:39:51.175298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.862 #18 NEW cov: 11715 ft: 12526 corp: 4/190b lim: 85 exec/s: 0 rss: 68Mb L: 64/64 MS: 1 ShuffleBytes- 00:08:34.862 [2024-07-13 10:39:51.225591] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:34.862 [2024-07-13 10:39:51.225616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.862 [2024-07-13 10:39:51.225737] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:34.862 [2024-07-13 10:39:51.225759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.862 [2024-07-13 10:39:51.225883] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:34.862 [2024-07-13 10:39:51.225903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.122 #19 NEW cov: 11800 ft: 12925 corp: 5/251b lim: 85 exec/s: 0 rss: 68Mb L: 61/64 MS: 1 CopyPart- 00:08:35.122 [2024-07-13 10:39:51.285582] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:35.122 [2024-07-13 10:39:51.285612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.122 [2024-07-13 10:39:51.285704] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:35.122 [2024-07-13 10:39:51.285723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.122 [2024-07-13 10:39:51.285851] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:35.122 [2024-07-13 10:39:51.285871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.122 #20 NEW cov: 11800 ft: 13086 corp: 6/315b lim: 85 exec/s: 0 rss: 68Mb L: 64/64 MS: 1 CMP- DE: "\377(h\306\351\233\222\010"- 00:08:35.122 [2024-07-13 10:39:51.335852] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:35.122 [2024-07-13 10:39:51.335878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.122 [2024-07-13 10:39:51.336003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:35.122 [2024-07-13 10:39:51.336024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.122 [2024-07-13 10:39:51.336141] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:35.122 [2024-07-13 10:39:51.336162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.122 #21 NEW cov: 11800 ft: 13236 corp: 7/379b lim: 85 exec/s: 0 rss: 68Mb L: 64/64 MS: 1 CrossOver- 00:08:35.122 [2024-07-13 10:39:51.395699] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 
00:08:35.122 [2024-07-13 10:39:51.395724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.122 #25 NEW cov: 11800 ft: 14130 corp: 8/409b lim: 85 exec/s: 0 rss: 68Mb L: 30/64 MS: 4 ShuffleBytes-ChangeByte-CopyPart-InsertRepeatedBytes- 00:08:35.122 [2024-07-13 10:39:51.435904] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:35.122 [2024-07-13 10:39:51.435936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.122 [2024-07-13 10:39:51.436050] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:35.122 [2024-07-13 10:39:51.436073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.122 [2024-07-13 10:39:51.436188] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:35.122 [2024-07-13 10:39:51.436205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.122 #26 NEW cov: 11800 ft: 14185 corp: 9/470b lim: 85 exec/s: 0 rss: 68Mb L: 61/64 MS: 1 ChangeByte- 00:08:35.122 [2024-07-13 10:39:51.486474] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:35.122 [2024-07-13 10:39:51.486507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.122 [2024-07-13 10:39:51.486640] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:35.122 [2024-07-13 10:39:51.486660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.122 [2024-07-13 10:39:51.486779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:35.122 [2024-07-13 10:39:51.486801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.382 #27 NEW cov: 11809 ft: 14290 corp: 10/535b lim: 85 exec/s: 0 rss: 69Mb L: 65/65 MS: 1 CrossOver- 00:08:35.382 [2024-07-13 10:39:51.536535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:35.382 [2024-07-13 10:39:51.536569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.382 [2024-07-13 10:39:51.536673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:35.382 [2024-07-13 10:39:51.536696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.382 [2024-07-13 10:39:51.536822] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:35.382 [2024-07-13 10:39:51.536844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.382 #28 NEW cov: 11809 ft: 14347 corp: 11/599b lim: 85 exec/s: 0 rss: 69Mb L: 64/65 MS: 1 
ChangeBit- 00:08:35.382 [2024-07-13 10:39:51.586724] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:35.382 [2024-07-13 10:39:51.586755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.382 [2024-07-13 10:39:51.586869] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:35.382 [2024-07-13 10:39:51.586889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.382 [2024-07-13 10:39:51.587012] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:35.382 [2024-07-13 10:39:51.587035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.382 #29 NEW cov: 11809 ft: 14400 corp: 12/665b lim: 85 exec/s: 0 rss: 69Mb L: 66/66 MS: 1 CrossOver- 00:08:35.382 [2024-07-13 10:39:51.626335] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:35.382 [2024-07-13 10:39:51.626366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.382 [2024-07-13 10:39:51.626486] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:35.382 [2024-07-13 10:39:51.626506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.382 [2024-07-13 10:39:51.626630] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:35.382 [2024-07-13 10:39:51.626650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.382 #30 NEW cov: 11809 ft: 14441 corp: 13/729b lim: 85 exec/s: 0 rss: 69Mb L: 64/66 MS: 1 ChangeByte- 00:08:35.382 [2024-07-13 10:39:51.666497] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:35.382 [2024-07-13 10:39:51.666528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.382 [2024-07-13 10:39:51.666628] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:35.382 [2024-07-13 10:39:51.666649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.382 [2024-07-13 10:39:51.666776] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:35.382 [2024-07-13 10:39:51.666803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.382 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:35.382 #31 NEW cov: 11832 ft: 14489 corp: 14/793b lim: 85 exec/s: 0 rss: 69Mb L: 64/66 MS: 1 ShuffleBytes- 00:08:35.382 [2024-07-13 10:39:51.716886] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:35.382 [2024-07-13 
10:39:51.716919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.382 [2024-07-13 10:39:51.717040] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:35.382 [2024-07-13 10:39:51.717059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.382 #32 NEW cov: 11832 ft: 14818 corp: 15/843b lim: 85 exec/s: 0 rss: 69Mb L: 50/66 MS: 1 EraseBytes- 00:08:35.382 [2024-07-13 10:39:51.767483] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:35.382 [2024-07-13 10:39:51.767516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.382 [2024-07-13 10:39:51.767648] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:35.382 [2024-07-13 10:39:51.767674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.382 [2024-07-13 10:39:51.767804] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:35.382 [2024-07-13 10:39:51.767822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.642 #33 NEW cov: 11832 ft: 14826 corp: 16/907b lim: 85 exec/s: 33 rss: 69Mb L: 64/66 MS: 1 CopyPart- 00:08:35.642 [2024-07-13 10:39:51.817409] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:35.642 [2024-07-13 10:39:51.817447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.642 [2024-07-13 10:39:51.817567] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:35.642 [2024-07-13 10:39:51.817588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.642 [2024-07-13 10:39:51.817715] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:35.642 [2024-07-13 10:39:51.817736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.642 #34 NEW cov: 11832 ft: 14841 corp: 17/968b lim: 85 exec/s: 34 rss: 69Mb L: 61/66 MS: 1 PersAutoDict- DE: "\377(h\306\351\233\222\010"- 00:08:35.642 [2024-07-13 10:39:51.856589] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:35.642 [2024-07-13 10:39:51.856615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.642 #35 NEW cov: 11832 ft: 14867 corp: 18/996b lim: 85 exec/s: 35 rss: 70Mb L: 28/66 MS: 1 CrossOver- 00:08:35.642 [2024-07-13 10:39:51.917749] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:35.642 [2024-07-13 10:39:51.917782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.642 [2024-07-13 
10:39:51.917907] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:35.642 [2024-07-13 10:39:51.917928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.642 [2024-07-13 10:39:51.918053] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:35.642 [2024-07-13 10:39:51.918072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.642 #36 NEW cov: 11832 ft: 14904 corp: 19/1060b lim: 85 exec/s: 36 rss: 70Mb L: 64/66 MS: 1 ChangeByte- 00:08:35.642 [2024-07-13 10:39:51.967267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:35.642 [2024-07-13 10:39:51.967296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.642 #39 NEW cov: 11832 ft: 14934 corp: 20/1080b lim: 85 exec/s: 39 rss: 70Mb L: 20/66 MS: 3 CrossOver-PersAutoDict-CopyPart- DE: "\377(h\306\351\233\222\010"- 00:08:35.642 [2024-07-13 10:39:52.028325] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:35.642 [2024-07-13 10:39:52.028360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.642 [2024-07-13 10:39:52.028475] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:35.642 [2024-07-13 10:39:52.028494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.642 [2024-07-13 10:39:52.028616] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:35.642 [2024-07-13 10:39:52.028643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.642 [2024-07-13 10:39:52.028770] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:35.642 [2024-07-13 10:39:52.028792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.902 #40 NEW cov: 11832 ft: 15267 corp: 21/1163b lim: 85 exec/s: 40 rss: 70Mb L: 83/83 MS: 1 InsertRepeatedBytes- 00:08:35.902 [2024-07-13 10:39:52.088232] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:35.902 [2024-07-13 10:39:52.088266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.902 [2024-07-13 10:39:52.088390] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:35.902 [2024-07-13 10:39:52.088413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.902 [2024-07-13 10:39:52.088538] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:35.902 [2024-07-13 10:39:52.088560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.902 #41 NEW cov: 11832 ft: 15282 corp: 22/1229b lim: 85 exec/s: 41 rss: 70Mb L: 66/83 MS: 1 CMP- DE: "\377\377"- 00:08:35.902 [2024-07-13 10:39:52.148432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:35.903 [2024-07-13 10:39:52.148467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.903 [2024-07-13 10:39:52.148564] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:35.903 [2024-07-13 10:39:52.148585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.903 [2024-07-13 10:39:52.148715] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:35.903 [2024-07-13 10:39:52.148735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.903 #42 NEW cov: 11832 ft: 15284 corp: 23/1290b lim: 85 exec/s: 42 rss: 70Mb L: 61/83 MS: 1 CopyPart- 00:08:35.903 [2024-07-13 10:39:52.198551] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:35.903 [2024-07-13 10:39:52.198584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.903 [2024-07-13 10:39:52.198732] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:35.903 [2024-07-13 10:39:52.198757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.903 [2024-07-13 10:39:52.198884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:35.903 [2024-07-13 10:39:52.198900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.903 #43 NEW cov: 11832 ft: 15349 corp: 24/1354b lim: 85 exec/s: 43 rss: 70Mb L: 64/83 MS: 1 ChangeByte- 00:08:35.903 [2024-07-13 10:39:52.248117] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:35.903 [2024-07-13 10:39:52.248142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.903 #44 NEW cov: 11832 ft: 15394 corp: 25/1385b lim: 85 exec/s: 44 rss: 70Mb L: 31/83 MS: 1 InsertByte- 00:08:36.162 [2024-07-13 10:39:52.308396] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:36.162 [2024-07-13 10:39:52.308422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.162 #45 NEW cov: 11832 ft: 15413 corp: 26/1406b lim: 85 exec/s: 45 rss: 70Mb L: 21/83 MS: 1 InsertByte- 00:08:36.162 [2024-07-13 10:39:52.359050] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:36.162 [2024-07-13 10:39:52.359087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.162 [2024-07-13 10:39:52.359202] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:36.162 [2024-07-13 10:39:52.359224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.162 [2024-07-13 10:39:52.359346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:36.162 [2024-07-13 10:39:52.359366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.162 #46 NEW cov: 11832 ft: 15429 corp: 27/1470b lim: 85 exec/s: 46 rss: 70Mb L: 64/83 MS: 1 ChangeByte- 00:08:36.162 [2024-07-13 10:39:52.409279] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:36.162 [2024-07-13 10:39:52.409307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.162 [2024-07-13 10:39:52.409427] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:36.162 [2024-07-13 10:39:52.409453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.162 [2024-07-13 10:39:52.409580] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:36.162 [2024-07-13 10:39:52.409603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.162 #47 NEW cov: 11832 ft: 15493 corp: 28/1531b lim: 85 exec/s: 47 rss: 70Mb L: 61/83 MS: 1 ChangeBinInt- 00:08:36.162 [2024-07-13 10:39:52.459265] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:36.162 [2024-07-13 10:39:52.459300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.162 [2024-07-13 10:39:52.459415] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:36.163 [2024-07-13 10:39:52.459438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.163 [2024-07-13 10:39:52.459589] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:36.163 [2024-07-13 10:39:52.459612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.163 #48 NEW cov: 11832 ft: 15516 corp: 29/1592b lim: 85 exec/s: 48 rss: 70Mb L: 61/83 MS: 1 PersAutoDict- DE: "\377(h\306\351\233\222\010"- 00:08:36.163 [2024-07-13 10:39:52.509693] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:36.163 [2024-07-13 10:39:52.509721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.163 [2024-07-13 10:39:52.509817] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:36.163 [2024-07-13 10:39:52.509837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:08:36.163 [2024-07-13 10:39:52.509952] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:36.163 [2024-07-13 10:39:52.509975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.163 [2024-07-13 10:39:52.510102] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:36.163 [2024-07-13 10:39:52.510124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.163 #49 NEW cov: 11832 ft: 15542 corp: 30/1665b lim: 85 exec/s: 49 rss: 70Mb L: 73/83 MS: 1 CopyPart- 00:08:36.422 [2024-07-13 10:39:52.549957] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:36.422 [2024-07-13 10:39:52.549989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.422 [2024-07-13 10:39:52.550107] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:36.422 [2024-07-13 10:39:52.550130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.422 [2024-07-13 10:39:52.550248] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:36.423 [2024-07-13 10:39:52.550281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.423 [2024-07-13 10:39:52.550405] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:36.423 [2024-07-13 10:39:52.550428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.423 #50 NEW cov: 11832 ft: 15549 corp: 31/1738b lim: 85 exec/s: 50 rss: 70Mb L: 73/83 MS: 1 CrossOver- 00:08:36.423 [2024-07-13 10:39:52.599385] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:36.423 [2024-07-13 10:39:52.599414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.423 [2024-07-13 10:39:52.599536] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:36.423 [2024-07-13 10:39:52.599562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.423 [2024-07-13 10:39:52.599693] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:36.423 [2024-07-13 10:39:52.599715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.423 #51 NEW cov: 11832 ft: 15586 corp: 32/1799b lim: 85 exec/s: 51 rss: 70Mb L: 61/83 MS: 1 PersAutoDict- DE: "\377(h\306\351\233\222\010"- 00:08:36.423 [2024-07-13 10:39:52.649914] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:36.423 [2024-07-13 10:39:52.649943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.423 [2024-07-13 10:39:52.650062] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:36.423 [2024-07-13 10:39:52.650083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.423 [2024-07-13 10:39:52.650202] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:36.423 [2024-07-13 10:39:52.650225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.423 #52 NEW cov: 11832 ft: 15650 corp: 33/1863b lim: 85 exec/s: 52 rss: 70Mb L: 64/83 MS: 1 ChangeBinInt- 00:08:36.423 [2024-07-13 10:39:52.699609] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:36.423 [2024-07-13 10:39:52.699636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.423 #53 NEW cov: 11832 ft: 15658 corp: 34/1883b lim: 85 exec/s: 53 rss: 70Mb L: 20/83 MS: 1 ShuffleBytes- 00:08:36.423 [2024-07-13 10:39:52.739228] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:36.423 [2024-07-13 10:39:52.739261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.423 #54 NEW cov: 11832 ft: 15662 corp: 35/1903b lim: 85 exec/s: 54 rss: 70Mb L: 20/83 MS: 1 ChangeBit- 00:08:36.423 [2024-07-13 10:39:52.790351] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:36.423 [2024-07-13 10:39:52.790383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.423 [2024-07-13 10:39:52.790532] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:36.423 [2024-07-13 10:39:52.790557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.423 [2024-07-13 10:39:52.790686] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:36.423 [2024-07-13 10:39:52.790709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.681 #55 NEW cov: 11832 ft: 15742 corp: 36/1967b lim: 85 exec/s: 27 rss: 70Mb L: 64/83 MS: 1 ChangeBit- 00:08:36.681 #55 DONE cov: 11832 ft: 15742 corp: 36/1967b lim: 85 exec/s: 27 rss: 70Mb 00:08:36.681 ###### Recommended dictionary. ###### 00:08:36.681 "\377(h\306\351\233\222\010" # Uses: 4 00:08:36.681 "\377\377" # Uses: 0 00:08:36.681 ###### End of recommended dictionary. 
######
00:08:36.681 Done 55 runs in 2 second(s)
00:08:36.681 10:39:52 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_22.conf
10:39:52 -- ../common.sh@72 -- # (( i++ ))
10:39:52 -- ../common.sh@72 -- # (( i < fuzz_num ))
10:39:52 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1
10:39:52 -- nvmf/run.sh@23 -- # local fuzzer_type=23
10:39:52 -- nvmf/run.sh@24 -- # local timen=1
10:39:52 -- nvmf/run.sh@25 -- # local core=0x1
10:39:52 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23
10:39:52 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf
10:39:52 -- nvmf/run.sh@29 -- # printf %02d 23
10:39:52 -- nvmf/run.sh@29 -- # port=4423
10:39:52 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23
10:39:52 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423'
10:39:52 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
10:39:52 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 -r /var/tmp/spdk23.sock
[2024-07-13 10:39:52.954056] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization...
[2024-07-13 10:39:52.954116] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1999629 ]
EAL: No free 2048 kB hugepages reported on node 1
[2024-07-13 10:39:53.129838] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
[2024-07-13 10:39:53.150563] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
[2024-07-13 10:39:53.150689] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
[2024-07-13 10:39:53.202320] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
[2024-07-13 10:39:53.218583] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 ***
INFO: Running with entropic power schedule (0xFF, 100).
INFO: Seed: 3708651330
INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9),
INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280),
INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23
INFO: A corpus is not provided, starting from an empty corpus
#2 INITED exec/s: 0 rss: 57Mb
WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:36.939 This may also happen if the target rejected all inputs we tried so far 00:08:36.939 [2024-07-13 10:39:53.273898] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:36.939 [2024-07-13 10:39:53.273932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.939 [2024-07-13 10:39:53.273991] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:36.939 [2024-07-13 10:39:53.274009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.198 NEW_FUNC[1/671]: 0x4c8f10 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:37.198 NEW_FUNC[2/671]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:37.198 #17 NEW cov: 11529 ft: 11526 corp: 2/15b lim: 25 exec/s: 0 rss: 65Mb L: 14/14 MS: 5 ChangeByte-ChangeBit-CopyPart-CopyPart-InsertRepeatedBytes- 00:08:37.458 [2024-07-13 10:39:53.605934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.458 [2024-07-13 10:39:53.605980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.458 [2024-07-13 10:39:53.606117] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.458 [2024-07-13 10:39:53.606146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.458 [2024-07-13 10:39:53.606285] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:37.458 [2024-07-13 10:39:53.606313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.458 [2024-07-13 10:39:53.606455] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:37.458 [2024-07-13 10:39:53.606484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.458 #18 NEW cov: 11642 ft: 12755 corp: 3/39b lim: 25 exec/s: 0 rss: 65Mb L: 24/24 MS: 1 InsertRepeatedBytes- 00:08:37.458 [2024-07-13 10:39:53.665506] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.458 [2024-07-13 10:39:53.665541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.458 [2024-07-13 10:39:53.665692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.458 [2024-07-13 10:39:53.665715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.458 #26 NEW cov: 11648 ft: 12983 corp: 4/53b lim: 25 exec/s: 0 rss: 65Mb L: 14/24 MS: 3 CrossOver-ChangeBit-InsertRepeatedBytes- 00:08:37.458 [2024-07-13 10:39:53.715865] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.458 
[2024-07-13 10:39:53.715896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.458 [2024-07-13 10:39:53.716003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.458 [2024-07-13 10:39:53.716023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.458 [2024-07-13 10:39:53.716147] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:37.458 [2024-07-13 10:39:53.716170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.458 #27 NEW cov: 11733 ft: 13468 corp: 5/71b lim: 25 exec/s: 0 rss: 65Mb L: 18/24 MS: 1 CrossOver- 00:08:37.458 [2024-07-13 10:39:53.755340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.458 [2024-07-13 10:39:53.755365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.458 [2024-07-13 10:39:53.755508] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.458 [2024-07-13 10:39:53.755530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.458 #30 NEW cov: 11733 ft: 13600 corp: 6/83b lim: 25 exec/s: 0 rss: 65Mb L: 12/24 MS: 3 ChangeByte-ChangeByte-InsertRepeatedBytes- 00:08:37.458 [2024-07-13 10:39:53.795848] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.458 [2024-07-13 10:39:53.795880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.458 [2024-07-13 10:39:53.795962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.458 [2024-07-13 10:39:53.795982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.458 [2024-07-13 10:39:53.796098] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:37.458 [2024-07-13 10:39:53.796136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.458 [2024-07-13 10:39:53.796263] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:37.458 [2024-07-13 10:39:53.796288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.458 #33 NEW cov: 11733 ft: 13714 corp: 7/104b lim: 25 exec/s: 0 rss: 65Mb L: 21/24 MS: 3 ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:08:37.458 [2024-07-13 10:39:53.836098] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.458 [2024-07-13 10:39:53.836131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.458 [2024-07-13 10:39:53.836238] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT 
(0e) sqid:1 cid:1 nsid:0 00:08:37.458 [2024-07-13 10:39:53.836261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.458 [2024-07-13 10:39:53.836383] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:37.458 [2024-07-13 10:39:53.836404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.718 #34 NEW cov: 11733 ft: 13742 corp: 8/122b lim: 25 exec/s: 0 rss: 65Mb L: 18/24 MS: 1 CMP- DE: "\001\000\000\000"- 00:08:37.718 [2024-07-13 10:39:53.875977] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.718 [2024-07-13 10:39:53.876010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.718 [2024-07-13 10:39:53.876099] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.718 [2024-07-13 10:39:53.876119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.718 [2024-07-13 10:39:53.876242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:37.718 [2024-07-13 10:39:53.876263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.718 [2024-07-13 10:39:53.876399] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:37.718 [2024-07-13 10:39:53.876420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.718 #35 NEW cov: 11733 ft: 13805 corp: 9/143b lim: 25 exec/s: 0 rss: 65Mb L: 21/24 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:08:37.718 [2024-07-13 10:39:53.926593] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.718 [2024-07-13 10:39:53.926620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.718 [2024-07-13 10:39:53.926689] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.718 [2024-07-13 10:39:53.926711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.718 [2024-07-13 10:39:53.926837] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:37.718 [2024-07-13 10:39:53.926860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.718 [2024-07-13 10:39:53.926995] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:37.718 [2024-07-13 10:39:53.927018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.718 #36 NEW cov: 11733 ft: 13825 corp: 10/164b lim: 25 exec/s: 0 rss: 65Mb L: 21/24 MS: 1 CrossOver- 00:08:37.718 [2024-07-13 10:39:53.976400] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.718 [2024-07-13 10:39:53.976432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.718 [2024-07-13 10:39:53.976562] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.718 [2024-07-13 10:39:53.976585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.718 #37 NEW cov: 11733 ft: 13906 corp: 11/177b lim: 25 exec/s: 0 rss: 66Mb L: 13/24 MS: 1 EraseBytes- 00:08:37.718 [2024-07-13 10:39:54.017098] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.718 [2024-07-13 10:39:54.017128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.718 [2024-07-13 10:39:54.017233] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.718 [2024-07-13 10:39:54.017255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.718 [2024-07-13 10:39:54.017385] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:37.718 [2024-07-13 10:39:54.017407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.718 [2024-07-13 10:39:54.017534] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:37.718 [2024-07-13 10:39:54.017556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.718 [2024-07-13 10:39:54.017692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:37.718 [2024-07-13 10:39:54.017714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:37.718 #38 NEW cov: 11733 ft: 13973 corp: 12/202b lim: 25 exec/s: 0 rss: 66Mb L: 25/25 MS: 1 CopyPart- 00:08:37.718 [2024-07-13 10:39:54.056195] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.718 [2024-07-13 10:39:54.056229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.718 [2024-07-13 10:39:54.056353] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.718 [2024-07-13 10:39:54.056377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.718 #39 NEW cov: 11733 ft: 14004 corp: 13/216b lim: 25 exec/s: 0 rss: 66Mb L: 14/25 MS: 1 ChangeBit- 00:08:37.977 [2024-07-13 10:39:54.106834] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.977 [2024-07-13 10:39:54.106867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.978 [2024-07-13 10:39:54.106987] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) 
sqid:1 cid:1 nsid:0 00:08:37.978 [2024-07-13 10:39:54.107011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.978 #40 NEW cov: 11733 ft: 14089 corp: 14/228b lim: 25 exec/s: 0 rss: 66Mb L: 12/25 MS: 1 ChangeBit- 00:08:37.978 [2024-07-13 10:39:54.147370] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.978 [2024-07-13 10:39:54.147397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.978 [2024-07-13 10:39:54.147508] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.978 [2024-07-13 10:39:54.147530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.978 [2024-07-13 10:39:54.147647] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:37.978 [2024-07-13 10:39:54.147670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.978 [2024-07-13 10:39:54.147794] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:37.978 [2024-07-13 10:39:54.147817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.978 [2024-07-13 10:39:54.147934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:37.978 [2024-07-13 10:39:54.147954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:37.978 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:37.978 #41 NEW cov: 11756 ft: 14190 corp: 15/253b lim: 25 exec/s: 0 rss: 66Mb L: 25/25 MS: 1 CrossOver- 00:08:37.978 [2024-07-13 10:39:54.197074] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.978 [2024-07-13 10:39:54.197101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.978 [2024-07-13 10:39:54.197240] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.978 [2024-07-13 10:39:54.197255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.978 #42 NEW cov: 11756 ft: 14202 corp: 16/266b lim: 25 exec/s: 0 rss: 66Mb L: 13/25 MS: 1 InsertByte- 00:08:37.978 [2024-07-13 10:39:54.236778] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.978 [2024-07-13 10:39:54.236804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.978 [2024-07-13 10:39:54.236930] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.978 [2024-07-13 10:39:54.236951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.978 #43 
NEW cov: 11756 ft: 14215 corp: 17/278b lim: 25 exec/s: 43 rss: 66Mb L: 12/25 MS: 1 ChangeBit- 00:08:37.978 [2024-07-13 10:39:54.287373] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.978 [2024-07-13 10:39:54.287406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.978 [2024-07-13 10:39:54.287547] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.978 [2024-07-13 10:39:54.287571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.978 #44 NEW cov: 11756 ft: 14274 corp: 18/292b lim: 25 exec/s: 44 rss: 66Mb L: 14/25 MS: 1 CMP- DE: "\002\000\000\000"- 00:08:37.978 [2024-07-13 10:39:54.337850] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:37.978 [2024-07-13 10:39:54.337883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.978 [2024-07-13 10:39:54.337993] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:37.978 [2024-07-13 10:39:54.338012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.978 [2024-07-13 10:39:54.338139] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:37.978 [2024-07-13 10:39:54.338160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.978 [2024-07-13 10:39:54.338287] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:37.978 [2024-07-13 10:39:54.338309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.237 #45 NEW cov: 11756 ft: 14368 corp: 19/315b lim: 25 exec/s: 45 rss: 66Mb L: 23/25 MS: 1 CopyPart- 00:08:38.237 [2024-07-13 10:39:54.398014] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:38.237 [2024-07-13 10:39:54.398052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.237 [2024-07-13 10:39:54.398165] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:38.237 [2024-07-13 10:39:54.398195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.237 [2024-07-13 10:39:54.398317] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:38.237 [2024-07-13 10:39:54.398339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.237 [2024-07-13 10:39:54.398465] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:38.237 [2024-07-13 10:39:54.398488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.237 #46 NEW 
cov: 11756 ft: 14377 corp: 20/336b lim: 25 exec/s: 46 rss: 66Mb L: 21/25 MS: 1 ChangeByte- 00:08:38.237 [2024-07-13 10:39:54.437900] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:38.237 [2024-07-13 10:39:54.437934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.237 [2024-07-13 10:39:54.438026] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:38.237 [2024-07-13 10:39:54.438047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.237 [2024-07-13 10:39:54.438173] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:38.237 [2024-07-13 10:39:54.438195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.237 [2024-07-13 10:39:54.438316] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:38.237 [2024-07-13 10:39:54.438334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.237 [2024-07-13 10:39:54.438461] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:38.237 [2024-07-13 10:39:54.438479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:38.237 #47 NEW cov: 11756 ft: 14388 corp: 21/361b lim: 25 exec/s: 47 rss: 66Mb L: 25/25 MS: 1 CopyPart- 00:08:38.237 [2024-07-13 10:39:54.487966] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:38.237 [2024-07-13 10:39:54.487998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.237 [2024-07-13 10:39:54.488106] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:38.237 [2024-07-13 10:39:54.488129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.237 #48 NEW cov: 11756 ft: 14419 corp: 22/374b lim: 25 exec/s: 48 rss: 66Mb L: 13/25 MS: 1 ChangeBit- 00:08:38.237 [2024-07-13 10:39:54.527990] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:38.237 [2024-07-13 10:39:54.528020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.237 [2024-07-13 10:39:54.528142] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:38.237 [2024-07-13 10:39:54.528162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.237 #49 NEW cov: 11756 ft: 14432 corp: 23/388b lim: 25 exec/s: 49 rss: 66Mb L: 14/25 MS: 1 ChangeBit- 00:08:38.237 [2024-07-13 10:39:54.567616] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:38.237 [2024-07-13 10:39:54.567645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.237 [2024-07-13 10:39:54.567754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:38.237 [2024-07-13 10:39:54.567778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.237 #50 NEW cov: 11756 ft: 14468 corp: 24/402b lim: 25 exec/s: 50 rss: 66Mb L: 14/25 MS: 1 ChangeBit- 00:08:38.237 [2024-07-13 10:39:54.618322] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:38.237 [2024-07-13 10:39:54.618346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.237 [2024-07-13 10:39:54.618480] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:38.237 [2024-07-13 10:39:54.618502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.496 #51 NEW cov: 11756 ft: 14542 corp: 25/416b lim: 25 exec/s: 51 rss: 66Mb L: 14/25 MS: 1 ChangeByte- 00:08:38.496 [2024-07-13 10:39:54.668566] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:38.496 [2024-07-13 10:39:54.668599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.496 [2024-07-13 10:39:54.668730] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:38.496 [2024-07-13 10:39:54.668752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.496 [2024-07-13 10:39:54.668881] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:38.496 [2024-07-13 10:39:54.668912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.496 #52 NEW cov: 11756 ft: 14549 corp: 26/434b lim: 25 exec/s: 52 rss: 66Mb L: 18/25 MS: 1 ChangeBinInt- 00:08:38.497 [2024-07-13 10:39:54.708641] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:38.497 [2024-07-13 10:39:54.708671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.497 [2024-07-13 10:39:54.708812] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:38.497 [2024-07-13 10:39:54.708835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.497 [2024-07-13 10:39:54.708957] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:38.497 [2024-07-13 10:39:54.708979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.497 #53 NEW cov: 11756 ft: 14558 corp: 27/452b lim: 25 exec/s: 53 rss: 67Mb L: 18/25 MS: 1 ChangeBinInt- 00:08:38.497 [2024-07-13 10:39:54.759014] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 
00:08:38.497 [2024-07-13 10:39:54.759043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.497 [2024-07-13 10:39:54.759134] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:38.497 [2024-07-13 10:39:54.759158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.497 [2024-07-13 10:39:54.759289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:38.497 [2024-07-13 10:39:54.759311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.497 [2024-07-13 10:39:54.759438] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:38.497 [2024-07-13 10:39:54.759461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.497 #54 NEW cov: 11756 ft: 14569 corp: 28/476b lim: 25 exec/s: 54 rss: 67Mb L: 24/25 MS: 1 ChangeBinInt- 00:08:38.497 [2024-07-13 10:39:54.808973] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:38.497 [2024-07-13 10:39:54.809004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.497 [2024-07-13 10:39:54.809119] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:38.497 [2024-07-13 10:39:54.809139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.497 [2024-07-13 10:39:54.809263] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:38.497 [2024-07-13 10:39:54.809287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.497 #55 NEW cov: 11756 ft: 14576 corp: 29/495b lim: 25 exec/s: 55 rss: 67Mb L: 19/25 MS: 1 InsertByte- 00:08:38.497 [2024-07-13 10:39:54.848728] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:38.497 [2024-07-13 10:39:54.848753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.497 [2024-07-13 10:39:54.848878] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:38.497 [2024-07-13 10:39:54.848911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.497 #56 NEW cov: 11756 ft: 14612 corp: 30/509b lim: 25 exec/s: 56 rss: 67Mb L: 14/25 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:08:38.757 [2024-07-13 10:39:54.899549] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:38.757 [2024-07-13 10:39:54.899580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.757 [2024-07-13 10:39:54.899674] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 
cid:1 nsid:0 00:08:38.757 [2024-07-13 10:39:54.899692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.757 [2024-07-13 10:39:54.899790] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:38.757 [2024-07-13 10:39:54.899810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.757 [2024-07-13 10:39:54.899931] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:38.757 [2024-07-13 10:39:54.899951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.757 #57 NEW cov: 11756 ft: 14635 corp: 31/531b lim: 25 exec/s: 57 rss: 67Mb L: 22/25 MS: 1 InsertRepeatedBytes- 00:08:38.757 [2024-07-13 10:39:54.949748] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:38.757 [2024-07-13 10:39:54.949781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.757 [2024-07-13 10:39:54.949872] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:38.757 [2024-07-13 10:39:54.949897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.757 [2024-07-13 10:39:54.950026] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:38.757 [2024-07-13 10:39:54.950048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.757 [2024-07-13 10:39:54.950173] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:38.757 [2024-07-13 10:39:54.950199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.757 [2024-07-13 10:39:54.950317] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:38.757 [2024-07-13 10:39:54.950339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:38.757 #58 NEW cov: 11756 ft: 14707 corp: 32/556b lim: 25 exec/s: 58 rss: 67Mb L: 25/25 MS: 1 ChangeBit- 00:08:38.757 [2024-07-13 10:39:54.999700] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:38.757 [2024-07-13 10:39:54.999733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.757 [2024-07-13 10:39:54.999828] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:38.757 [2024-07-13 10:39:54.999847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.757 [2024-07-13 10:39:54.999956] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:38.757 [2024-07-13 10:39:54.999975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.757 [2024-07-13 10:39:55.000096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:38.757 [2024-07-13 10:39:55.000116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.757 #59 NEW cov: 11756 ft: 14715 corp: 33/579b lim: 25 exec/s: 59 rss: 67Mb L: 23/25 MS: 1 InsertRepeatedBytes- 00:08:38.757 [2024-07-13 10:39:55.050098] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:38.757 [2024-07-13 10:39:55.050131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.757 [2024-07-13 10:39:55.050216] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:38.757 [2024-07-13 10:39:55.050239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.757 [2024-07-13 10:39:55.050363] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:38.757 [2024-07-13 10:39:55.050383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.757 [2024-07-13 10:39:55.050520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:38.757 [2024-07-13 10:39:55.050543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.757 [2024-07-13 10:39:55.050667] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:38.757 [2024-07-13 10:39:55.050691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:38.757 #60 NEW cov: 11756 ft: 14727 corp: 34/604b lim: 25 exec/s: 60 rss: 67Mb L: 25/25 MS: 1 ChangeBit- 00:08:38.757 [2024-07-13 10:39:55.089753] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:38.757 [2024-07-13 10:39:55.089788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.757 [2024-07-13 10:39:55.089908] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:38.757 [2024-07-13 10:39:55.089931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.757 [2024-07-13 10:39:55.090055] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:38.757 [2024-07-13 10:39:55.090077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.757 #61 NEW cov: 11756 ft: 14747 corp: 35/622b lim: 25 exec/s: 61 rss: 67Mb L: 18/25 MS: 1 PersAutoDict- DE: "\002\000\000\000"- 00:08:38.757 [2024-07-13 10:39:55.129963] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:38.757 [2024-07-13 10:39:55.129993] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.757 [2024-07-13 10:39:55.130088] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:38.757 [2024-07-13 10:39:55.130106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.757 [2024-07-13 10:39:55.130225] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:38.757 [2024-07-13 10:39:55.130244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.757 [2024-07-13 10:39:55.130375] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:38.757 [2024-07-13 10:39:55.130398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.757 [2024-07-13 10:39:55.130524] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:38.757 [2024-07-13 10:39:55.130546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:39.016 #62 NEW cov: 11756 ft: 14767 corp: 36/647b lim: 25 exec/s: 62 rss: 67Mb L: 25/25 MS: 1 InsertByte- 00:08:39.016 [2024-07-13 10:39:55.180040] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:39.016 [2024-07-13 10:39:55.180076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.016 [2024-07-13 10:39:55.180193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:39.016 [2024-07-13 10:39:55.180218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.016 [2024-07-13 10:39:55.180341] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:39.016 [2024-07-13 10:39:55.180364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.016 [2024-07-13 10:39:55.180489] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:39.016 [2024-07-13 10:39:55.180511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.016 #63 NEW cov: 11756 ft: 14770 corp: 37/671b lim: 25 exec/s: 63 rss: 67Mb L: 24/25 MS: 1 CopyPart- 00:08:39.016 [2024-07-13 10:39:55.220427] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:39.016 [2024-07-13 10:39:55.220462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.016 [2024-07-13 10:39:55.220559] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:39.017 [2024-07-13 10:39:55.220581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.017 [2024-07-13 10:39:55.220706] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:39.017 [2024-07-13 10:39:55.220725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.017 [2024-07-13 10:39:55.220845] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:39.017 [2024-07-13 10:39:55.220864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.017 #64 NEW cov: 11756 ft: 14775 corp: 38/695b lim: 25 exec/s: 64 rss: 67Mb L: 24/25 MS: 1 ChangeBinInt- 00:08:39.017 [2024-07-13 10:39:55.260304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:39.017 [2024-07-13 10:39:55.260334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.017 [2024-07-13 10:39:55.260429] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:39.017 [2024-07-13 10:39:55.260460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.017 [2024-07-13 10:39:55.260604] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:39.017 [2024-07-13 10:39:55.260625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.017 #65 NEW cov: 11756 ft: 14781 corp: 39/712b lim: 25 exec/s: 32 rss: 67Mb L: 17/25 MS: 1 CrossOver- 00:08:39.017 #65 DONE cov: 11756 ft: 14781 corp: 39/712b lim: 25 exec/s: 32 rss: 67Mb 00:08:39.017 ###### Recommended dictionary. ###### 00:08:39.017 "\001\000\000\000" # Uses: 2 00:08:39.017 "\002\000\000\000" # Uses: 1 00:08:39.017 ###### End of recommended dictionary. 
###### 00:08:39.017 Done 65 runs in 2 second(s) 00:08:39.017 10:39:55 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_23.conf 00:08:39.017 10:39:55 -- ../common.sh@72 -- # (( i++ )) 00:08:39.017 10:39:55 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:39.017 10:39:55 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:08:39.017 10:39:55 -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:08:39.017 10:39:55 -- nvmf/run.sh@24 -- # local timen=1 00:08:39.017 10:39:55 -- nvmf/run.sh@25 -- # local core=0x1 00:08:39.017 10:39:55 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:39.017 10:39:55 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:08:39.017 10:39:55 -- nvmf/run.sh@29 -- # printf %02d 24 00:08:39.276 10:39:55 -- nvmf/run.sh@29 -- # port=4424 00:08:39.276 10:39:55 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:39.276 10:39:55 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:08:39.276 10:39:55 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:39.276 10:39:55 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 -r /var/tmp/spdk24.sock 00:08:39.276 [2024-07-13 10:39:55.438007] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:08:39.276 [2024-07-13 10:39:55.438077] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2000173 ] 00:08:39.276 EAL: No free 2048 kB hugepages reported on node 1 00:08:39.276 [2024-07-13 10:39:55.613527] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:39.276 [2024-07-13 10:39:55.633025] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:39.276 [2024-07-13 10:39:55.633146] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:39.535 [2024-07-13 10:39:55.684513] tcp.c: 659:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:39.535 [2024-07-13 10:39:55.700786] tcp.c: 952:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:08:39.535 INFO: Running with entropic power schedule (0xFF, 100). 00:08:39.535 INFO: Seed: 1895703194 00:08:39.535 INFO: Loaded 1 modules (341341 inline 8-bit counters): 341341 [0x26ac74c, 0x26ffca9), 00:08:39.535 INFO: Loaded 1 PC tables (341341 PCs): 341341 [0x26ffcb0,0x2c35280), 00:08:39.535 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:39.536 INFO: A corpus is not provided, starting from an empty corpus 00:08:39.536 #2 INITED exec/s: 0 rss: 60Mb 00:08:39.536 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:39.536 This may also happen if the target rejected all inputs we tried so far 00:08:39.536 [2024-07-13 10:39:55.756260] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:184549376 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.536 [2024-07-13 10:39:55.756291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.536 [2024-07-13 10:39:55.756347] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.536 [2024-07-13 10:39:55.756362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.536 [2024-07-13 10:39:55.756419] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.536 [2024-07-13 10:39:55.756434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.795 NEW_FUNC[1/672]: 0x4c9ff0 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:08:39.795 NEW_FUNC[2/672]: 0x4dac50 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:39.795 #5 NEW cov: 11597 ft: 11593 corp: 2/76b lim: 100 exec/s: 0 rss: 68Mb L: 75/75 MS: 3 CopyPart-ChangeBit-InsertRepeatedBytes- 00:08:39.795 [2024-07-13 10:39:56.066674] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:179961856 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.795 [2024-07-13 10:39:56.066705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.795 #10 NEW cov: 11714 ft: 12943 corp: 3/109b lim: 100 exec/s: 0 rss: 68Mb L: 33/75 MS: 5 CopyPart-CrossOver-ChangeByte-ChangeByte-InsertRepeatedBytes- 00:08:39.795 [2024-07-13 10:39:56.107148] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.795 [2024-07-13 10:39:56.107177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.795 [2024-07-13 10:39:56.107216] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.795 [2024-07-13 10:39:56.107231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.795 [2024-07-13 10:39:56.107281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.795 [2024-07-13 10:39:56.107297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.795 [2024-07-13 10:39:56.107350] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.795 [2024-07-13 10:39:56.107365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 
00:08:39.795 #12 NEW cov: 11720 ft: 13476 corp: 4/200b lim: 100 exec/s: 0 rss: 68Mb L: 91/91 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:39.795 [2024-07-13 10:39:56.147239] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.795 [2024-07-13 10:39:56.147270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.795 [2024-07-13 10:39:56.147312] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.795 [2024-07-13 10:39:56.147326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.795 [2024-07-13 10:39:56.147378] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.795 [2024-07-13 10:39:56.147393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.795 [2024-07-13 10:39:56.147450] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:39.795 [2024-07-13 10:39:56.147465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.795 #13 NEW cov: 11805 ft: 13775 corp: 5/291b lim: 100 exec/s: 0 rss: 68Mb L: 91/91 MS: 1 ShuffleBytes- 00:08:40.054 [2024-07-13 10:39:56.187194] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:184549376 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.054 [2024-07-13 10:39:56.187222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.054 [2024-07-13 10:39:56.187264] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.054 [2024-07-13 10:39:56.187279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.054 [2024-07-13 10:39:56.187333] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.054 [2024-07-13 10:39:56.187348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.054 #14 NEW cov: 11805 ft: 13837 corp: 6/366b lim: 100 exec/s: 0 rss: 68Mb L: 75/91 MS: 1 ShuffleBytes- 00:08:40.054 [2024-07-13 10:39:56.227455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.054 [2024-07-13 10:39:56.227483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.054 [2024-07-13 10:39:56.227523] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.054 [2024-07-13 10:39:56.227538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:08:40.054 [2024-07-13 10:39:56.227591] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.054 [2024-07-13 10:39:56.227606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.054 [2024-07-13 10:39:56.227659] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.054 [2024-07-13 10:39:56.227673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.054 #15 NEW cov: 11805 ft: 13879 corp: 7/458b lim: 100 exec/s: 0 rss: 68Mb L: 92/92 MS: 1 InsertByte- 00:08:40.054 [2024-07-13 10:39:56.267551] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.054 [2024-07-13 10:39:56.267578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.054 [2024-07-13 10:39:56.267622] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.054 [2024-07-13 10:39:56.267637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.054 [2024-07-13 10:39:56.267690] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.054 [2024-07-13 10:39:56.267704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.054 [2024-07-13 10:39:56.267757] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.054 [2024-07-13 10:39:56.267772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.054 #16 NEW cov: 11805 ft: 13997 corp: 8/549b lim: 100 exec/s: 0 rss: 68Mb L: 91/92 MS: 1 ChangeBit- 00:08:40.054 [2024-07-13 10:39:56.307712] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.054 [2024-07-13 10:39:56.307742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.054 [2024-07-13 10:39:56.307777] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.054 [2024-07-13 10:39:56.307793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.054 [2024-07-13 10:39:56.307845] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.054 [2024-07-13 10:39:56.307861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.054 [2024-07-13 10:39:56.307913] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:1663823974888374272 
len:5889 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.054 [2024-07-13 10:39:56.307929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.054 #17 NEW cov: 11805 ft: 14011 corp: 9/647b lim: 100 exec/s: 0 rss: 68Mb L: 98/98 MS: 1 InsertRepeatedBytes- 00:08:40.054 [2024-07-13 10:39:56.347715] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:184549376 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.054 [2024-07-13 10:39:56.347743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.054 [2024-07-13 10:39:56.347786] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.054 [2024-07-13 10:39:56.347801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.054 [2024-07-13 10:39:56.347856] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.054 [2024-07-13 10:39:56.347872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.054 #23 NEW cov: 11805 ft: 14091 corp: 10/722b lim: 100 exec/s: 0 rss: 68Mb L: 75/98 MS: 1 CrossOver- 00:08:40.054 [2024-07-13 10:39:56.387539] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.054 [2024-07-13 10:39:56.387566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.054 #24 NEW cov: 11805 ft: 14198 corp: 11/758b lim: 100 exec/s: 0 rss: 68Mb L: 36/98 MS: 1 CrossOver- 00:08:40.054 [2024-07-13 10:39:56.428123] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.054 [2024-07-13 10:39:56.428157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.054 [2024-07-13 10:39:56.428209] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5116089176692883456 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.054 [2024-07-13 10:39:56.428224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.054 [2024-07-13 10:39:56.428278] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.054 [2024-07-13 10:39:56.428295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.054 [2024-07-13 10:39:56.428349] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6499312016031744 len:5912 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.054 [2024-07-13 10:39:56.428364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.313 #25 NEW cov: 11805 ft: 14216 corp: 12/857b lim: 100 exec/s: 0 rss: 69Mb L: 99/99 MS: 1 InsertByte- 
00:08:40.313 [2024-07-13 10:39:56.467861] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:179961856 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.313 [2024-07-13 10:39:56.467889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.313 [2024-07-13 10:39:56.467943] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.313 [2024-07-13 10:39:56.467958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.313 #31 NEW cov: 11805 ft: 14540 corp: 13/901b lim: 100 exec/s: 0 rss: 69Mb L: 44/99 MS: 1 CopyPart- 00:08:40.313 [2024-07-13 10:39:56.508264] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:184549376 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.313 [2024-07-13 10:39:56.508292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.313 [2024-07-13 10:39:56.508335] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.313 [2024-07-13 10:39:56.508350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.314 [2024-07-13 10:39:56.508404] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.314 [2024-07-13 10:39:56.508421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.314 [2024-07-13 10:39:56.508474] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.314 [2024-07-13 10:39:56.508488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.314 #32 NEW cov: 11805 ft: 14607 corp: 14/993b lim: 100 exec/s: 0 rss: 69Mb L: 92/99 MS: 1 InsertRepeatedBytes- 00:08:40.314 [2024-07-13 10:39:56.548309] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:184549376 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.314 [2024-07-13 10:39:56.548338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.314 [2024-07-13 10:39:56.548377] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.314 [2024-07-13 10:39:56.548397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.314 [2024-07-13 10:39:56.548453] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.314 [2024-07-13 10:39:56.548469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.314 #33 NEW cov: 11805 ft: 14643 corp: 15/1068b lim: 100 exec/s: 0 rss: 69Mb L: 75/99 MS: 1 ShuffleBytes- 
00:08:40.314 [2024-07-13 10:39:56.578082] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.314 [2024-07-13 10:39:56.578111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.314 #34 NEW cov: 11805 ft: 14715 corp: 16/1100b lim: 100 exec/s: 0 rss: 69Mb L: 32/99 MS: 1 EraseBytes- 00:08:40.314 [2024-07-13 10:39:56.618660] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.314 [2024-07-13 10:39:56.618689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.314 [2024-07-13 10:39:56.618725] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5116089176692883456 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.314 [2024-07-13 10:39:56.618741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.314 [2024-07-13 10:39:56.618795] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.314 [2024-07-13 10:39:56.618810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.314 [2024-07-13 10:39:56.618865] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6499312016031744 len:5912 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.314 [2024-07-13 10:39:56.618881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.314 NEW_FUNC[1/1]: 0x197bcf0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:40.314 #35 NEW cov: 11828 ft: 14806 corp: 17/1199b lim: 100 exec/s: 0 rss: 69Mb L: 99/99 MS: 1 CMP- DE: "\002\000\000\000\000\000\000\000"- 00:08:40.314 [2024-07-13 10:39:56.668716] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.314 [2024-07-13 10:39:56.668744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.314 [2024-07-13 10:39:56.668782] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.314 [2024-07-13 10:39:56.668798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.314 [2024-07-13 10:39:56.668851] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.314 [2024-07-13 10:39:56.668867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.314 [2024-07-13 10:39:56.668920] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:385875968 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.314 [2024-07-13 10:39:56.668936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.314 #36 NEW cov: 11828 ft: 14840 corp: 18/1287b lim: 100 exec/s: 0 rss: 69Mb L: 88/99 MS: 1 EraseBytes- 00:08:40.574 [2024-07-13 10:39:56.708862] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.574 [2024-07-13 10:39:56.708894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.574 [2024-07-13 10:39:56.708929] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5116089176692883456 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.574 [2024-07-13 10:39:56.708944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.574 [2024-07-13 10:39:56.708995] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.574 [2024-07-13 10:39:56.709010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.574 [2024-07-13 10:39:56.709063] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6499312016031744 len:5912 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.574 [2024-07-13 10:39:56.709078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.574 #37 NEW cov: 11828 ft: 14850 corp: 19/1386b lim: 100 exec/s: 37 rss: 69Mb L: 99/99 MS: 1 PersAutoDict- DE: "\002\000\000\000\000\000\000\000"- 00:08:40.574 [2024-07-13 10:39:56.748831] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:184549376 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.574 [2024-07-13 10:39:56.748859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.574 [2024-07-13 10:39:56.748895] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.574 [2024-07-13 10:39:56.748911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.574 [2024-07-13 10:39:56.748966] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.574 [2024-07-13 10:39:56.748982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.574 #38 NEW cov: 11828 ft: 14906 corp: 20/1461b lim: 100 exec/s: 38 rss: 70Mb L: 75/99 MS: 1 ChangeByte- 00:08:40.574 [2024-07-13 10:39:56.788946] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:179961856 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.574 [2024-07-13 10:39:56.788973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.574 [2024-07-13 10:39:56.789010] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.574 [2024-07-13 10:39:56.789027] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.574 [2024-07-13 10:39:56.789080] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.574 [2024-07-13 10:39:56.789094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.574 #39 NEW cov: 11828 ft: 14918 corp: 21/1524b lim: 100 exec/s: 39 rss: 70Mb L: 63/99 MS: 1 CrossOver- 00:08:40.574 [2024-07-13 10:39:56.829239] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:184549376 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.574 [2024-07-13 10:39:56.829265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.574 [2024-07-13 10:39:56.829311] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.574 [2024-07-13 10:39:56.829329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.574 [2024-07-13 10:39:56.829381] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.574 [2024-07-13 10:39:56.829397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.574 [2024-07-13 10:39:56.829449] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.574 [2024-07-13 10:39:56.829460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.574 #40 NEW cov: 11828 ft: 14937 corp: 22/1617b lim: 100 exec/s: 40 rss: 70Mb L: 93/99 MS: 1 InsertByte- 00:08:40.574 [2024-07-13 10:39:56.869327] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:184549376 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.574 [2024-07-13 10:39:56.869353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.574 [2024-07-13 10:39:56.869400] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.574 [2024-07-13 10:39:56.869415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.574 [2024-07-13 10:39:56.869487] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.574 [2024-07-13 10:39:56.869503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.574 [2024-07-13 10:39:56.869557] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.574 [2024-07-13 10:39:56.869573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.574 
#41 NEW cov: 11828 ft: 14950 corp: 23/1710b lim: 100 exec/s: 41 rss: 70Mb L: 93/99 MS: 1 ChangeBinInt- 00:08:40.574 [2024-07-13 10:39:56.909446] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.574 [2024-07-13 10:39:56.909472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.574 [2024-07-13 10:39:56.909519] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.574 [2024-07-13 10:39:56.909535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.574 [2024-07-13 10:39:56.909585] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.574 [2024-07-13 10:39:56.909601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.574 [2024-07-13 10:39:56.909653] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:385875968 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.574 [2024-07-13 10:39:56.909668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.574 #42 NEW cov: 11828 ft: 14974 corp: 24/1798b lim: 100 exec/s: 42 rss: 70Mb L: 88/99 MS: 1 CrossOver- 00:08:40.574 [2024-07-13 10:39:56.949548] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:66816 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.574 [2024-07-13 10:39:56.949575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.574 [2024-07-13 10:39:56.949615] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5116089176692883456 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.574 [2024-07-13 10:39:56.949631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.574 [2024-07-13 10:39:56.949682] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.574 [2024-07-13 10:39:56.949697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.575 [2024-07-13 10:39:56.949750] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6499312016031744 len:5912 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.575 [2024-07-13 10:39:56.949765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.834 #43 NEW cov: 11828 ft: 14997 corp: 25/1897b lim: 100 exec/s: 43 rss: 70Mb L: 99/99 MS: 1 ChangeBinInt- 00:08:40.834 [2024-07-13 10:39:56.989190] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:179961856 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.834 [2024-07-13 10:39:56.989216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 
m:0 dnr:1 00:08:40.834 #44 NEW cov: 11828 ft: 15043 corp: 26/1933b lim: 100 exec/s: 44 rss: 70Mb L: 36/99 MS: 1 EraseBytes- 00:08:40.834 [2024-07-13 10:39:57.029335] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:179961856 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.834 [2024-07-13 10:39:57.029361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.834 #45 NEW cov: 11828 ft: 15112 corp: 27/1966b lim: 100 exec/s: 45 rss: 70Mb L: 33/99 MS: 1 ChangeBit- 00:08:40.834 [2024-07-13 10:39:57.069809] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:179961856 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.834 [2024-07-13 10:39:57.069836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.834 [2024-07-13 10:39:57.069873] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.834 [2024-07-13 10:39:57.069888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.834 [2024-07-13 10:39:57.069941] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:11284894768911129756 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.835 [2024-07-13 10:39:57.069958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.835 #46 NEW cov: 11828 ft: 15154 corp: 28/2043b lim: 100 exec/s: 46 rss: 70Mb L: 77/99 MS: 1 InsertRepeatedBytes- 00:08:40.835 [2024-07-13 10:39:57.110013] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.835 [2024-07-13 10:39:57.110040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.835 [2024-07-13 10:39:57.110082] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.835 [2024-07-13 10:39:57.110097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.835 [2024-07-13 10:39:57.110150] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.835 [2024-07-13 10:39:57.110164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.835 [2024-07-13 10:39:57.110219] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:385875968 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.835 [2024-07-13 10:39:57.110233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.835 #47 NEW cov: 11828 ft: 15165 corp: 29/2141b lim: 100 exec/s: 47 rss: 70Mb L: 98/99 MS: 1 CopyPart- 00:08:40.835 [2024-07-13 10:39:57.149816] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:179961856 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.835 [2024-07-13 
10:39:57.149843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.835 [2024-07-13 10:39:57.149889] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:45 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.835 [2024-07-13 10:39:57.149905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.835 #48 NEW cov: 11828 ft: 15210 corp: 30/2185b lim: 100 exec/s: 48 rss: 70Mb L: 44/99 MS: 1 ChangeBinInt- 00:08:40.835 [2024-07-13 10:39:57.189790] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:179961856 len:513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:40.835 [2024-07-13 10:39:57.189817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.835 #49 NEW cov: 11828 ft: 15247 corp: 31/2218b lim: 100 exec/s: 49 rss: 70Mb L: 33/99 MS: 1 ChangeBit- 00:08:41.093 [2024-07-13 10:39:57.230203] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:184549376 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.093 [2024-07-13 10:39:57.230231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.093 [2024-07-13 10:39:57.230265] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:12384898975268864 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.093 [2024-07-13 10:39:57.230281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.093 [2024-07-13 10:39:57.230333] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.093 [2024-07-13 10:39:57.230348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.093 #55 NEW cov: 11828 ft: 15273 corp: 32/2293b lim: 100 exec/s: 55 rss: 70Mb L: 75/99 MS: 1 ChangeByte- 00:08:41.093 [2024-07-13 10:39:57.270538] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:184549376 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.093 [2024-07-13 10:39:57.270565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.093 [2024-07-13 10:39:57.270604] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.093 [2024-07-13 10:39:57.270620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.093 [2024-07-13 10:39:57.270673] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.093 [2024-07-13 10:39:57.270689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.093 [2024-07-13 10:39:57.270740] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.093 
[2024-07-13 10:39:57.270756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.093 #56 NEW cov: 11828 ft: 15282 corp: 33/2385b lim: 100 exec/s: 56 rss: 70Mb L: 92/99 MS: 1 CopyPart- 00:08:41.093 [2024-07-13 10:39:57.310126] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:184549376 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.093 [2024-07-13 10:39:57.310153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.093 #57 NEW cov: 11828 ft: 15290 corp: 34/2420b lim: 100 exec/s: 57 rss: 70Mb L: 35/99 MS: 1 CrossOver- 00:08:41.093 [2024-07-13 10:39:57.350374] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:144115188255817728 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.093 [2024-07-13 10:39:57.350401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.093 [2024-07-13 10:39:57.350452] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.093 [2024-07-13 10:39:57.350469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.093 #58 NEW cov: 11828 ft: 15339 corp: 35/2464b lim: 100 exec/s: 58 rss: 70Mb L: 44/99 MS: 1 PersAutoDict- DE: "\002\000\000\000\000\000\000\000"- 00:08:41.093 [2024-07-13 10:39:57.390890] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.093 [2024-07-13 10:39:57.390916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.093 [2024-07-13 10:39:57.390961] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.093 [2024-07-13 10:39:57.390976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.093 [2024-07-13 10:39:57.391030] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.093 [2024-07-13 10:39:57.391043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.093 [2024-07-13 10:39:57.391098] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.093 [2024-07-13 10:39:57.391114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.093 #59 NEW cov: 11828 ft: 15357 corp: 36/2563b lim: 100 exec/s: 59 rss: 70Mb L: 99/99 MS: 1 InsertRepeatedBytes- 00:08:41.093 [2024-07-13 10:39:57.430939] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:179961856 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.093 [2024-07-13 10:39:57.430966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.093 [2024-07-13 
10:39:57.431004] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.093 [2024-07-13 10:39:57.431020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.093 [2024-07-13 10:39:57.431074] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.093 [2024-07-13 10:39:57.431091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.093 [2024-07-13 10:39:57.431145] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.093 [2024-07-13 10:39:57.431175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.093 #60 NEW cov: 11828 ft: 15427 corp: 37/2654b lim: 100 exec/s: 60 rss: 70Mb L: 91/99 MS: 1 InsertRepeatedBytes- 00:08:41.093 [2024-07-13 10:39:57.471036] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:179961856 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.093 [2024-07-13 10:39:57.471063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.094 [2024-07-13 10:39:57.471100] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.094 [2024-07-13 10:39:57.471116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.094 [2024-07-13 10:39:57.471168] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.094 [2024-07-13 10:39:57.471184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.094 [2024-07-13 10:39:57.471237] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.094 [2024-07-13 10:39:57.471252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.352 #61 NEW cov: 11828 ft: 15437 corp: 38/2745b lim: 100 exec/s: 61 rss: 70Mb L: 91/99 MS: 1 PersAutoDict- DE: "\002\000\000\000\000\000\000\000"- 00:08:41.352 [2024-07-13 10:39:57.511153] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:184549376 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.352 [2024-07-13 10:39:57.511179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.352 [2024-07-13 10:39:57.511217] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.352 [2024-07-13 10:39:57.511233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 
cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.352 [2024-07-13 10:39:57.511286] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.352 [2024-07-13 10:39:57.511301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.352 [2024-07-13 10:39:57.511356] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.352 [2024-07-13 10:39:57.511370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.352 #62 NEW cov: 11828 ft: 15453 corp: 39/2829b lim: 100 exec/s: 62 rss: 70Mb L: 84/99 MS: 1 EraseBytes- 00:08:41.352 [2024-07-13 10:39:57.551246] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:66816 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.352 [2024-07-13 10:39:57.551272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.352 [2024-07-13 10:39:57.551319] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5116089176692883456 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.352 [2024-07-13 10:39:57.551334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.352 [2024-07-13 10:39:57.551386] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.352 [2024-07-13 10:39:57.551401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.352 [2024-07-13 10:39:57.551463] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6499312016031744 len:5912 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.352 [2024-07-13 10:39:57.551479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.352 #63 NEW cov: 11828 ft: 15456 corp: 40/2928b lim: 100 exec/s: 63 rss: 70Mb L: 99/99 MS: 1 ShuffleBytes- 00:08:41.352 [2024-07-13 10:39:57.590920] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:179961856 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.352 [2024-07-13 10:39:57.590947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.352 #64 NEW cov: 11828 ft: 15471 corp: 41/2961b lim: 100 exec/s: 64 rss: 70Mb L: 33/99 MS: 1 CopyPart- 00:08:41.352 [2024-07-13 10:39:57.631343] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:179961856 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.352 [2024-07-13 10:39:57.631370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.352 [2024-07-13 10:39:57.631412] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.352 [2024-07-13 10:39:57.631427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.352 [2024-07-13 10:39:57.631500] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.352 [2024-07-13 10:39:57.631516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.352 #66 NEW cov: 11828 ft: 15505 corp: 42/3026b lim: 100 exec/s: 66 rss: 70Mb L: 65/99 MS: 2 EraseBytes-InsertRepeatedBytes- 00:08:41.352 [2024-07-13 10:39:57.671157] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:179961856 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.352 [2024-07-13 10:39:57.671185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.352 #67 NEW cov: 11828 ft: 15522 corp: 43/3059b lim: 100 exec/s: 67 rss: 70Mb L: 33/99 MS: 1 ChangeByte- 00:08:41.352 [2024-07-13 10:39:57.711314] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:41.352 [2024-07-13 10:39:57.711341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.611 #68 NEW cov: 11828 ft: 15530 corp: 44/3091b lim: 100 exec/s: 34 rss: 71Mb L: 32/99 MS: 1 ChangeBit- 00:08:41.611 #68 DONE cov: 11828 ft: 15530 corp: 44/3091b lim: 100 exec/s: 34 rss: 71Mb 00:08:41.611 ###### Recommended dictionary. ###### 00:08:41.611 "\002\000\000\000\000\000\000\000" # Uses: 3 00:08:41.611 ###### End of recommended dictionary. ###### 00:08:41.611 Done 68 runs in 2 second(s) 00:08:41.611 10:39:57 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_24.conf 00:08:41.611 10:39:57 -- ../common.sh@72 -- # (( i++ )) 00:08:41.611 10:39:57 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:41.611 10:39:57 -- nvmf/run.sh@71 -- # trap - SIGINT SIGTERM EXIT 00:08:41.611 00:08:41.611 real 1m1.950s 00:08:41.611 user 1m38.770s 00:08:41.611 sys 0m6.588s 00:08:41.611 10:39:57 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:41.611 10:39:57 -- common/autotest_common.sh@10 -- # set +x 00:08:41.611 ************************************ 00:08:41.611 END TEST nvmf_fuzz 00:08:41.611 ************************************ 00:08:41.611 10:39:57 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:08:41.611 10:39:57 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:08:41.611 10:39:57 -- fuzz/llvm.sh@63 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:41.611 10:39:57 -- common/autotest_common.sh@1077 -- # '[' 2 -le 1 ']' 00:08:41.611 10:39:57 -- common/autotest_common.sh@1083 -- # xtrace_disable 00:08:41.611 10:39:57 -- common/autotest_common.sh@10 -- # set +x 00:08:41.611 ************************************ 00:08:41.611 START TEST vfio_fuzz 00:08:41.611 ************************************ 00:08:41.611 10:39:57 -- common/autotest_common.sh@1104 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:41.611 * Looking for test storage... 
00:08:41.611 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:41.872 10:39:57 -- vfio/run.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:41.872 10:39:57 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:41.872 10:39:57 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:41.872 10:39:57 -- common/autotest_common.sh@34 -- # set -e 00:08:41.872 10:39:57 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:41.872 10:39:57 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:41.872 10:39:57 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:41.872 10:39:57 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:41.872 10:39:57 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:41.872 10:39:57 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:41.872 10:39:58 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:41.872 10:39:58 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:41.872 10:39:58 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:41.872 10:39:58 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:41.872 10:39:58 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:41.872 10:39:58 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:41.872 10:39:58 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:41.872 10:39:58 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:41.872 10:39:58 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:41.872 10:39:58 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:41.872 10:39:58 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:41.872 10:39:58 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:41.872 10:39:58 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:41.872 10:39:58 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:41.872 10:39:58 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:08:41.872 10:39:58 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:41.872 10:39:58 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:41.872 10:39:58 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:08:41.872 10:39:58 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:08:41.872 10:39:58 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:08:41.872 10:39:58 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:41.872 10:39:58 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:08:41.872 10:39:58 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:08:41.872 10:39:58 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:41.872 10:39:58 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:41.872 10:39:58 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:08:41.872 10:39:58 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:08:41.872 10:39:58 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:08:41.872 10:39:58 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:08:41.872 10:39:58 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:08:41.872 10:39:58 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:08:41.872 10:39:58 -- common/build_config.sh@34 -- # 
CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:08:41.872 10:39:58 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:08:41.872 10:39:58 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:41.872 10:39:58 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:08:41.872 10:39:58 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:08:41.872 10:39:58 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:08:41.872 10:39:58 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:08:41.872 10:39:58 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:41.872 10:39:58 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:08:41.872 10:39:58 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:08:41.872 10:39:58 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:41.872 10:39:58 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:08:41.872 10:39:58 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:08:41.872 10:39:58 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:08:41.872 10:39:58 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:41.873 10:39:58 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:08:41.873 10:39:58 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:08:41.873 10:39:58 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:08:41.873 10:39:58 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:08:41.873 10:39:58 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:08:41.873 10:39:58 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:08:41.873 10:39:58 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:08:41.873 10:39:58 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:08:41.873 10:39:58 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:08:41.873 10:39:58 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:08:41.873 10:39:58 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:08:41.873 10:39:58 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:08:41.873 10:39:58 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:41.873 10:39:58 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:08:41.873 10:39:58 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:08:41.873 10:39:58 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:08:41.873 10:39:58 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:08:41.873 10:39:58 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:41.873 10:39:58 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:08:41.873 10:39:58 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:08:41.873 10:39:58 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:08:41.873 10:39:58 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:08:41.873 10:39:58 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:08:41.873 10:39:58 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:08:41.873 10:39:58 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:08:41.873 10:39:58 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:08:41.873 10:39:58 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:08:41.873 10:39:58 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:08:41.873 10:39:58 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:41.873 10:39:58 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:08:41.873 
10:39:58 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:08:41.873 10:39:58 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:41.873 10:39:58 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:41.873 10:39:58 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:41.873 10:39:58 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:41.873 10:39:58 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:41.873 10:39:58 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:41.873 10:39:58 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:41.873 10:39:58 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:41.873 10:39:58 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:41.873 10:39:58 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:41.873 10:39:58 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:41.873 10:39:58 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:41.873 10:39:58 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:41.873 10:39:58 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:41.873 10:39:58 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:41.873 10:39:58 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:41.873 #define SPDK_CONFIG_H 00:08:41.873 #define SPDK_CONFIG_APPS 1 00:08:41.873 #define SPDK_CONFIG_ARCH native 00:08:41.873 #undef SPDK_CONFIG_ASAN 00:08:41.873 #undef SPDK_CONFIG_AVAHI 00:08:41.873 #undef SPDK_CONFIG_CET 00:08:41.873 #define SPDK_CONFIG_COVERAGE 1 00:08:41.873 #define SPDK_CONFIG_CROSS_PREFIX 00:08:41.873 #undef SPDK_CONFIG_CRYPTO 00:08:41.873 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:41.873 #undef SPDK_CONFIG_CUSTOMOCF 00:08:41.873 #undef SPDK_CONFIG_DAOS 00:08:41.873 #define SPDK_CONFIG_DAOS_DIR 00:08:41.873 #define SPDK_CONFIG_DEBUG 1 00:08:41.873 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:41.873 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:41.873 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:41.873 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:41.873 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:41.873 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:41.873 #define SPDK_CONFIG_EXAMPLES 1 00:08:41.873 #undef SPDK_CONFIG_FC 00:08:41.873 #define SPDK_CONFIG_FC_PATH 00:08:41.873 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:41.873 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:41.873 #undef SPDK_CONFIG_FUSE 00:08:41.873 #define SPDK_CONFIG_FUZZER 1 00:08:41.873 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:08:41.873 #undef SPDK_CONFIG_GOLANG 00:08:41.873 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:41.873 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:41.873 #undef 
SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:41.873 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:41.873 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:41.873 #define SPDK_CONFIG_IDXD 1 00:08:41.873 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:41.873 #undef SPDK_CONFIG_IPSEC_MB 00:08:41.873 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:41.873 #define SPDK_CONFIG_ISAL 1 00:08:41.873 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:41.873 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:41.873 #define SPDK_CONFIG_LIBDIR 00:08:41.873 #undef SPDK_CONFIG_LTO 00:08:41.873 #define SPDK_CONFIG_MAX_LCORES 00:08:41.873 #define SPDK_CONFIG_NVME_CUSE 1 00:08:41.873 #undef SPDK_CONFIG_OCF 00:08:41.873 #define SPDK_CONFIG_OCF_PATH 00:08:41.873 #define SPDK_CONFIG_OPENSSL_PATH 00:08:41.873 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:41.873 #undef SPDK_CONFIG_PGO_USE 00:08:41.873 #define SPDK_CONFIG_PREFIX /usr/local 00:08:41.873 #undef SPDK_CONFIG_RAID5F 00:08:41.873 #undef SPDK_CONFIG_RBD 00:08:41.873 #define SPDK_CONFIG_RDMA 1 00:08:41.873 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:41.873 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:41.873 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:41.873 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:41.873 #undef SPDK_CONFIG_SHARED 00:08:41.873 #undef SPDK_CONFIG_SMA 00:08:41.873 #define SPDK_CONFIG_TESTS 1 00:08:41.873 #undef SPDK_CONFIG_TSAN 00:08:41.873 #define SPDK_CONFIG_UBLK 1 00:08:41.873 #define SPDK_CONFIG_UBSAN 1 00:08:41.873 #undef SPDK_CONFIG_UNIT_TESTS 00:08:41.873 #undef SPDK_CONFIG_URING 00:08:41.873 #define SPDK_CONFIG_URING_PATH 00:08:41.873 #undef SPDK_CONFIG_URING_ZNS 00:08:41.873 #undef SPDK_CONFIG_USDT 00:08:41.873 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:41.873 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:41.873 #define SPDK_CONFIG_VFIO_USER 1 00:08:41.873 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:41.873 #define SPDK_CONFIG_VHOST 1 00:08:41.873 #define SPDK_CONFIG_VIRTIO 1 00:08:41.873 #undef SPDK_CONFIG_VTUNE 00:08:41.873 #define SPDK_CONFIG_VTUNE_DIR 00:08:41.873 #define SPDK_CONFIG_WERROR 1 00:08:41.873 #define SPDK_CONFIG_WPDK_DIR 00:08:41.873 #undef SPDK_CONFIG_XNVME 00:08:41.873 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:41.873 10:39:58 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:41.873 10:39:58 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:41.873 10:39:58 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:41.873 10:39:58 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:41.873 10:39:58 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:41.873 10:39:58 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:41.873 10:39:58 -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:41.873 10:39:58 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:41.873 10:39:58 -- paths/export.sh@5 -- # export PATH 00:08:41.873 10:39:58 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:41.873 10:39:58 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:41.873 10:39:58 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:41.873 10:39:58 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:41.873 10:39:58 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:41.873 10:39:58 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:41.873 10:39:58 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:41.873 10:39:58 -- pm/common@16 -- # TEST_TAG=N/A 00:08:41.873 10:39:58 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:41.873 10:39:58 -- common/autotest_common.sh@52 -- # : 1 00:08:41.873 10:39:58 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:08:41.873 10:39:58 -- common/autotest_common.sh@56 -- # : 0 00:08:41.873 10:39:58 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:41.873 10:39:58 -- common/autotest_common.sh@58 -- # : 0 00:08:41.873 10:39:58 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:08:41.873 10:39:58 -- common/autotest_common.sh@60 -- # : 1 00:08:41.873 10:39:58 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:41.873 10:39:58 -- common/autotest_common.sh@62 -- # : 0 00:08:41.873 10:39:58 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:08:41.873 10:39:58 -- common/autotest_common.sh@64 -- # : 00:08:41.873 10:39:58 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:08:41.873 10:39:58 -- common/autotest_common.sh@66 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:08:41.874 10:39:58 
-- common/autotest_common.sh@68 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:08:41.874 10:39:58 -- common/autotest_common.sh@70 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:08:41.874 10:39:58 -- common/autotest_common.sh@72 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:41.874 10:39:58 -- common/autotest_common.sh@74 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:08:41.874 10:39:58 -- common/autotest_common.sh@76 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:08:41.874 10:39:58 -- common/autotest_common.sh@78 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:08:41.874 10:39:58 -- common/autotest_common.sh@80 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:08:41.874 10:39:58 -- common/autotest_common.sh@82 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:08:41.874 10:39:58 -- common/autotest_common.sh@84 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:08:41.874 10:39:58 -- common/autotest_common.sh@86 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:08:41.874 10:39:58 -- common/autotest_common.sh@88 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:08:41.874 10:39:58 -- common/autotest_common.sh@90 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:41.874 10:39:58 -- common/autotest_common.sh@92 -- # : 1 00:08:41.874 10:39:58 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:08:41.874 10:39:58 -- common/autotest_common.sh@94 -- # : 1 00:08:41.874 10:39:58 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:08:41.874 10:39:58 -- common/autotest_common.sh@96 -- # : rdma 00:08:41.874 10:39:58 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:41.874 10:39:58 -- common/autotest_common.sh@98 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:08:41.874 10:39:58 -- common/autotest_common.sh@100 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:08:41.874 10:39:58 -- common/autotest_common.sh@102 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:08:41.874 10:39:58 -- common/autotest_common.sh@104 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:08:41.874 10:39:58 -- common/autotest_common.sh@106 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:08:41.874 10:39:58 -- common/autotest_common.sh@108 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:08:41.874 10:39:58 -- common/autotest_common.sh@110 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:08:41.874 10:39:58 -- common/autotest_common.sh@112 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:41.874 10:39:58 -- common/autotest_common.sh@114 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:08:41.874 
10:39:58 -- common/autotest_common.sh@116 -- # : 1 00:08:41.874 10:39:58 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:08:41.874 10:39:58 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:41.874 10:39:58 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:41.874 10:39:58 -- common/autotest_common.sh@120 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:08:41.874 10:39:58 -- common/autotest_common.sh@122 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:08:41.874 10:39:58 -- common/autotest_common.sh@124 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:08:41.874 10:39:58 -- common/autotest_common.sh@126 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:08:41.874 10:39:58 -- common/autotest_common.sh@128 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:08:41.874 10:39:58 -- common/autotest_common.sh@130 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:08:41.874 10:39:58 -- common/autotest_common.sh@132 -- # : v23.11 00:08:41.874 10:39:58 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:08:41.874 10:39:58 -- common/autotest_common.sh@134 -- # : true 00:08:41.874 10:39:58 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:08:41.874 10:39:58 -- common/autotest_common.sh@136 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:08:41.874 10:39:58 -- common/autotest_common.sh@138 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:08:41.874 10:39:58 -- common/autotest_common.sh@140 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:08:41.874 10:39:58 -- common/autotest_common.sh@142 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:08:41.874 10:39:58 -- common/autotest_common.sh@144 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:08:41.874 10:39:58 -- common/autotest_common.sh@146 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:08:41.874 10:39:58 -- common/autotest_common.sh@148 -- # : 00:08:41.874 10:39:58 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:08:41.874 10:39:58 -- common/autotest_common.sh@150 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:08:41.874 10:39:58 -- common/autotest_common.sh@152 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:08:41.874 10:39:58 -- common/autotest_common.sh@154 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:08:41.874 10:39:58 -- common/autotest_common.sh@156 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:08:41.874 10:39:58 -- common/autotest_common.sh@158 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:08:41.874 10:39:58 -- common/autotest_common.sh@160 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:08:41.874 10:39:58 -- common/autotest_common.sh@163 -- # : 00:08:41.874 10:39:58 -- 
common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:08:41.874 10:39:58 -- common/autotest_common.sh@165 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:08:41.874 10:39:58 -- common/autotest_common.sh@167 -- # : 0 00:08:41.874 10:39:58 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:41.874 10:39:58 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:41.874 10:39:58 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:41.874 10:39:58 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:41.874 10:39:58 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:41.874 10:39:58 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:41.874 10:39:58 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:41.874 10:39:58 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:41.874 10:39:58 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:41.874 10:39:58 -- common/autotest_common.sh@177 
-- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:41.874 10:39:58 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:41.874 10:39:58 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:41.874 10:39:58 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:41.874 10:39:58 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:41.874 10:39:58 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:08:41.874 10:39:58 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:41.874 10:39:58 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:41.874 10:39:58 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:41.874 10:39:58 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:41.874 10:39:58 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:41.874 10:39:58 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:08:41.874 10:39:58 -- common/autotest_common.sh@196 -- # cat 00:08:41.874 10:39:58 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:08:41.874 10:39:58 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:41.874 10:39:58 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:41.874 10:39:58 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:41.875 10:39:58 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:41.875 10:39:58 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:08:41.875 10:39:58 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:08:41.875 10:39:58 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:41.875 10:39:58 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:41.875 10:39:58 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:41.875 10:39:58 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:41.875 10:39:58 -- 
common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:41.875 10:39:58 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:41.875 10:39:58 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:41.875 10:39:58 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:41.875 10:39:58 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:41.875 10:39:58 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:41.875 10:39:58 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:41.875 10:39:58 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:41.875 10:39:58 -- common/autotest_common.sh@248 -- # '[' 0 -eq 0 ']' 00:08:41.875 10:39:58 -- common/autotest_common.sh@249 -- # export valgrind= 00:08:41.875 10:39:58 -- common/autotest_common.sh@249 -- # valgrind= 00:08:41.875 10:39:58 -- common/autotest_common.sh@255 -- # uname -s 00:08:41.875 10:39:58 -- common/autotest_common.sh@255 -- # '[' Linux = Linux ']' 00:08:41.875 10:39:58 -- common/autotest_common.sh@256 -- # HUGEMEM=4096 00:08:41.875 10:39:58 -- common/autotest_common.sh@257 -- # export CLEAR_HUGE=yes 00:08:41.875 10:39:58 -- common/autotest_common.sh@257 -- # CLEAR_HUGE=yes 00:08:41.875 10:39:58 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:08:41.875 10:39:58 -- common/autotest_common.sh@258 -- # [[ 0 -eq 1 ]] 00:08:41.875 10:39:58 -- common/autotest_common.sh@265 -- # MAKE=make 00:08:41.875 10:39:58 -- common/autotest_common.sh@266 -- # MAKEFLAGS=-j112 00:08:41.875 10:39:58 -- common/autotest_common.sh@282 -- # export HUGEMEM=4096 00:08:41.875 10:39:58 -- common/autotest_common.sh@282 -- # HUGEMEM=4096 00:08:41.875 10:39:58 -- common/autotest_common.sh@284 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:41.875 10:39:58 -- common/autotest_common.sh@289 -- # NO_HUGE=() 00:08:41.875 10:39:58 -- common/autotest_common.sh@290 -- # TEST_MODE= 00:08:41.875 10:39:58 -- common/autotest_common.sh@309 -- # [[ -z 2000570 ]] 00:08:41.875 10:39:58 -- common/autotest_common.sh@309 -- # kill -0 2000570 00:08:41.875 10:39:58 -- common/autotest_common.sh@1665 -- # set_test_storage 2147483648 00:08:41.875 10:39:58 -- common/autotest_common.sh@319 -- # [[ -v testdir ]] 00:08:41.875 10:39:58 -- common/autotest_common.sh@321 -- # local requested_size=2147483648 00:08:41.875 10:39:58 -- common/autotest_common.sh@322 -- # local mount target_dir 00:08:41.875 10:39:58 -- common/autotest_common.sh@324 -- # local -A mounts fss sizes avails uses 00:08:41.875 10:39:58 -- common/autotest_common.sh@325 -- # local source fs size avail mount use 00:08:41.875 10:39:58 -- common/autotest_common.sh@327 -- # local storage_fallback storage_candidates 00:08:41.875 10:39:58 -- common/autotest_common.sh@329 -- # mktemp -udt spdk.XXXXXX 00:08:41.875 10:39:58 -- common/autotest_common.sh@329 -- # storage_fallback=/tmp/spdk.gkNmUt 00:08:41.875 10:39:58 -- common/autotest_common.sh@334 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:41.875 10:39:58 -- common/autotest_common.sh@336 -- # [[ -n '' ]] 00:08:41.875 10:39:58 -- common/autotest_common.sh@341 -- # 
[[ -n '' ]] 00:08:41.875 10:39:58 -- common/autotest_common.sh@346 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.gkNmUt/tests/vfio /tmp/spdk.gkNmUt 00:08:41.875 10:39:58 -- common/autotest_common.sh@349 -- # requested_size=2214592512 00:08:41.875 10:39:58 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:41.875 10:39:58 -- common/autotest_common.sh@318 -- # df -T 00:08:41.875 10:39:58 -- common/autotest_common.sh@318 -- # grep -v Filesystem 00:08:41.875 10:39:58 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_devtmpfs 00:08:41.875 10:39:58 -- common/autotest_common.sh@352 -- # fss["$mount"]=devtmpfs 00:08:41.875 10:39:58 -- common/autotest_common.sh@353 -- # avails["$mount"]=67108864 00:08:41.875 10:39:58 -- common/autotest_common.sh@353 -- # sizes["$mount"]=67108864 00:08:41.875 10:39:58 -- common/autotest_common.sh@354 -- # uses["$mount"]=0 00:08:41.875 10:39:58 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:41.875 10:39:58 -- common/autotest_common.sh@352 -- # mounts["$mount"]=/dev/pmem0 00:08:41.875 10:39:58 -- common/autotest_common.sh@352 -- # fss["$mount"]=ext2 00:08:41.875 10:39:58 -- common/autotest_common.sh@353 -- # avails["$mount"]=954408960 00:08:41.875 10:39:58 -- common/autotest_common.sh@353 -- # sizes["$mount"]=5284429824 00:08:41.875 10:39:58 -- common/autotest_common.sh@354 -- # uses["$mount"]=4330020864 00:08:41.875 10:39:58 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:41.875 10:39:58 -- common/autotest_common.sh@352 -- # mounts["$mount"]=spdk_root 00:08:41.875 10:39:58 -- common/autotest_common.sh@352 -- # fss["$mount"]=overlay 00:08:41.875 10:39:58 -- common/autotest_common.sh@353 -- # avails["$mount"]=52777320448 00:08:41.875 10:39:58 -- common/autotest_common.sh@353 -- # sizes["$mount"]=61742317568 00:08:41.875 10:39:58 -- common/autotest_common.sh@354 -- # uses["$mount"]=8964997120 00:08:41.875 10:39:58 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:41.875 10:39:58 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:41.875 10:39:58 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:41.875 10:39:58 -- common/autotest_common.sh@353 -- # avails["$mount"]=30868566016 00:08:41.875 10:39:58 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784 00:08:41.875 10:39:58 -- common/autotest_common.sh@354 -- # uses["$mount"]=2592768 00:08:41.875 10:39:58 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:41.875 10:39:58 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:41.875 10:39:58 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:41.875 10:39:58 -- common/autotest_common.sh@353 -- # avails["$mount"]=12342484992 00:08:41.875 10:39:58 -- common/autotest_common.sh@353 -- # sizes["$mount"]=12348465152 00:08:41.875 10:39:58 -- common/autotest_common.sh@354 -- # uses["$mount"]=5980160 00:08:41.875 10:39:58 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:41.875 10:39:58 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:41.875 10:39:58 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:41.875 10:39:58 -- common/autotest_common.sh@353 -- # avails["$mount"]=30870585344 00:08:41.875 10:39:58 -- common/autotest_common.sh@353 -- # sizes["$mount"]=30871158784 00:08:41.875 10:39:58 -- 
common/autotest_common.sh@354 -- # uses["$mount"]=573440 00:08:41.875 10:39:58 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:41.875 10:39:58 -- common/autotest_common.sh@352 -- # mounts["$mount"]=tmpfs 00:08:41.875 10:39:58 -- common/autotest_common.sh@352 -- # fss["$mount"]=tmpfs 00:08:41.875 10:39:58 -- common/autotest_common.sh@353 -- # avails["$mount"]=6174224384 00:08:41.875 10:39:58 -- common/autotest_common.sh@353 -- # sizes["$mount"]=6174228480 00:08:41.875 10:39:58 -- common/autotest_common.sh@354 -- # uses["$mount"]=4096 00:08:41.875 10:39:58 -- common/autotest_common.sh@351 -- # read -r source fs size use avail _ mount 00:08:41.875 10:39:58 -- common/autotest_common.sh@357 -- # printf '* Looking for test storage...\n' 00:08:41.875 * Looking for test storage... 00:08:41.875 10:39:58 -- common/autotest_common.sh@359 -- # local target_space new_size 00:08:41.875 10:39:58 -- common/autotest_common.sh@360 -- # for target_dir in "${storage_candidates[@]}" 00:08:41.875 10:39:58 -- common/autotest_common.sh@363 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:41.875 10:39:58 -- common/autotest_common.sh@363 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:41.875 10:39:58 -- common/autotest_common.sh@363 -- # mount=/ 00:08:41.875 10:39:58 -- common/autotest_common.sh@365 -- # target_space=52777320448 00:08:41.875 10:39:58 -- common/autotest_common.sh@366 -- # (( target_space == 0 || target_space < requested_size )) 00:08:41.875 10:39:58 -- common/autotest_common.sh@369 -- # (( target_space >= requested_size )) 00:08:41.875 10:39:58 -- common/autotest_common.sh@371 -- # [[ overlay == tmpfs ]] 00:08:41.875 10:39:58 -- common/autotest_common.sh@371 -- # [[ overlay == ramfs ]] 00:08:41.875 10:39:58 -- common/autotest_common.sh@371 -- # [[ / == / ]] 00:08:41.875 10:39:58 -- common/autotest_common.sh@372 -- # new_size=11179589632 00:08:41.875 10:39:58 -- common/autotest_common.sh@373 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:41.875 10:39:58 -- common/autotest_common.sh@378 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:41.875 10:39:58 -- common/autotest_common.sh@378 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:41.875 10:39:58 -- common/autotest_common.sh@379 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:41.875 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:41.875 10:39:58 -- common/autotest_common.sh@380 -- # return 0 00:08:41.875 10:39:58 -- common/autotest_common.sh@1667 -- # set -o errtrace 00:08:41.875 10:39:58 -- common/autotest_common.sh@1668 -- # shopt -s extdebug 00:08:41.875 10:39:58 -- common/autotest_common.sh@1669 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:41.875 10:39:58 -- common/autotest_common.sh@1671 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:41.875 10:39:58 -- common/autotest_common.sh@1672 -- # true 00:08:41.875 10:39:58 -- common/autotest_common.sh@1674 -- # xtrace_fd 00:08:41.875 10:39:58 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:41.875 10:39:58 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:41.875 10:39:58 -- common/autotest_common.sh@27 -- # exec 00:08:41.875 10:39:58 -- common/autotest_common.sh@29 -- # exec 00:08:41.875 10:39:58 -- common/autotest_common.sh@31 -- # 
xtrace_restore 00:08:41.875 10:39:58 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:08:41.875 10:39:58 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:41.875 10:39:58 -- common/autotest_common.sh@18 -- # set -x 00:08:41.875 10:39:58 -- vfio/run.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:41.875 10:39:58 -- ../common.sh@8 -- # pids=() 00:08:41.875 10:39:58 -- vfio/run.sh@58 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:41.875 10:39:58 -- vfio/run.sh@59 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:41.875 10:39:58 -- vfio/run.sh@59 -- # fuzz_num=7 00:08:41.875 10:39:58 -- vfio/run.sh@60 -- # (( fuzz_num != 0 )) 00:08:41.875 10:39:58 -- vfio/run.sh@62 -- # trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT 00:08:41.875 10:39:58 -- vfio/run.sh@65 -- # mem_size=0 00:08:41.875 10:39:58 -- vfio/run.sh@66 -- # [[ 1 -eq 1 ]] 00:08:41.875 10:39:58 -- vfio/run.sh@67 -- # start_llvm_fuzz_short 7 1 00:08:41.875 10:39:58 -- ../common.sh@69 -- # local fuzz_num=7 00:08:41.875 10:39:58 -- ../common.sh@70 -- # local time=1 00:08:41.875 10:39:58 -- ../common.sh@72 -- # (( i = 0 )) 00:08:41.875 10:39:58 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:41.875 10:39:58 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:41.876 10:39:58 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:41.876 10:39:58 -- vfio/run.sh@23 -- # local timen=1 00:08:41.876 10:39:58 -- vfio/run.sh@24 -- # local core=0x1 00:08:41.876 10:39:58 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:41.876 10:39:58 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:41.876 10:39:58 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:41.876 10:39:58 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:41.876 10:39:58 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:41.876 10:39:58 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:41.876 10:39:58 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:41.876 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:41.876 10:39:58 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:41.876 [2024-07-13 10:39:58.156760] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 
00:08:41.876 [2024-07-13 10:39:58.156851] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2000682 ] 00:08:41.876 EAL: No free 2048 kB hugepages reported on node 1 00:08:41.876 [2024-07-13 10:39:58.229551] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:42.135 [2024-07-13 10:39:58.266008] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:42.135 [2024-07-13 10:39:58.266168] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:42.135 INFO: Running with entropic power schedule (0xFF, 100). 00:08:42.135 INFO: Seed: 331724066 00:08:42.135 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x266dfcc, 0x26c0a63), 00:08:42.135 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x26c0a68,0x2beb3d8), 00:08:42.135 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:42.135 INFO: A corpus is not provided, starting from an empty corpus 00:08:42.135 #2 INITED exec/s: 0 rss: 61Mb 00:08:42.135 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:42.135 This may also happen if the target rejected all inputs we tried so far 00:08:42.653 NEW_FUNC[1/631]: 0x49e0e0 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:85 00:08:42.653 NEW_FUNC[2/631]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:42.653 #9 NEW cov: 10709 ft: 10677 corp: 2/61b lim: 60 exec/s: 0 rss: 66Mb L: 60/60 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:42.912 #10 NEW cov: 10726 ft: 13108 corp: 3/118b lim: 60 exec/s: 0 rss: 67Mb L: 57/60 MS: 1 EraseBytes- 00:08:43.171 NEW_FUNC[1/1]: 0x1948490 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:43.171 #11 NEW cov: 10743 ft: 13878 corp: 4/149b lim: 60 exec/s: 0 rss: 68Mb L: 31/60 MS: 1 EraseBytes- 00:08:43.171 #12 NEW cov: 10743 ft: 14362 corp: 5/207b lim: 60 exec/s: 12 rss: 69Mb L: 58/60 MS: 1 InsertByte- 00:08:43.430 #13 NEW cov: 10743 ft: 15041 corp: 6/265b lim: 60 exec/s: 13 rss: 69Mb L: 58/60 MS: 1 ChangeBinInt- 00:08:43.688 #14 NEW cov: 10743 ft: 15118 corp: 7/323b lim: 60 exec/s: 14 rss: 69Mb L: 58/60 MS: 1 ChangeByte- 00:08:43.947 #15 NEW cov: 10743 ft: 15340 corp: 8/383b lim: 60 exec/s: 15 rss: 69Mb L: 60/60 MS: 1 CrossOver- 00:08:44.206 #16 NEW cov: 10750 ft: 15609 corp: 9/427b lim: 60 exec/s: 16 rss: 69Mb L: 44/60 MS: 1 CrossOver- 00:08:44.206 #17 NEW cov: 10750 ft: 15838 corp: 10/484b lim: 60 exec/s: 8 rss: 69Mb L: 57/60 MS: 1 ChangeBit- 00:08:44.206 #17 DONE cov: 10750 ft: 15838 corp: 10/484b lim: 60 exec/s: 8 rss: 69Mb 00:08:44.206 Done 17 runs in 2 second(s) 00:08:44.464 10:40:00 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-0 00:08:44.464 10:40:00 -- ../common.sh@72 -- # (( i++ )) 00:08:44.464 10:40:00 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:44.464 10:40:00 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:44.464 10:40:00 -- vfio/run.sh@22 -- # local fuzzer_type=1 00:08:44.464 10:40:00 -- vfio/run.sh@23 -- # local timen=1 00:08:44.464 10:40:00 -- vfio/run.sh@24 -- # local core=0x1 00:08:44.464 10:40:00 -- vfio/run.sh@25 -- # local 
corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:44.464 10:40:00 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:08:44.464 10:40:00 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:08:44.464 10:40:00 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:08:44.464 10:40:00 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:08:44.464 10:40:00 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:44.464 10:40:00 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:08:44.464 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:44.464 10:40:00 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:08:44.723 [2024-07-13 10:40:00.855671] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:08:44.723 [2024-07-13 10:40:00.855752] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2001076 ] 00:08:44.723 EAL: No free 2048 kB hugepages reported on node 1 00:08:44.723 [2024-07-13 10:40:00.926746] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:44.723 [2024-07-13 10:40:00.963616] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:44.723 [2024-07-13 10:40:00.963757] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:44.983 INFO: Running with entropic power schedule (0xFF, 100). 00:08:44.983 INFO: Seed: 3030764532 00:08:44.983 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x266dfcc, 0x26c0a63), 00:08:44.983 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x26c0a68,0x2beb3d8), 00:08:44.983 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:44.983 INFO: A corpus is not provided, starting from an empty corpus 00:08:44.983 #2 INITED exec/s: 0 rss: 61Mb 00:08:44.983 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:44.983 This may also happen if the target rejected all inputs we tried so far 00:08:44.983 [2024-07-13 10:40:01.265478] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:44.983 [2024-07-13 10:40:01.265510] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:44.983 [2024-07-13 10:40:01.265529] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:45.501 NEW_FUNC[1/638]: 0x49e680 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:72 00:08:45.501 NEW_FUNC[2/638]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:45.501 #22 NEW cov: 10731 ft: 10671 corp: 2/17b lim: 40 exec/s: 0 rss: 66Mb L: 16/16 MS: 5 ShuffleBytes-ShuffleBytes-ShuffleBytes-ChangeBit-InsertRepeatedBytes- 00:08:45.501 [2024-07-13 10:40:01.737034] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:45.501 [2024-07-13 10:40:01.737068] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:45.501 [2024-07-13 10:40:01.737086] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:45.501 #28 NEW cov: 10745 ft: 13611 corp: 3/33b lim: 40 exec/s: 0 rss: 67Mb L: 16/16 MS: 1 ShuffleBytes- 00:08:45.760 [2024-07-13 10:40:01.944302] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:45.760 [2024-07-13 10:40:01.944324] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:45.760 [2024-07-13 10:40:01.944342] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:45.760 NEW_FUNC[1/1]: 0x1948490 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:45.760 #32 NEW cov: 10762 ft: 14171 corp: 4/37b lim: 40 exec/s: 0 rss: 68Mb L: 4/16 MS: 4 CopyPart-ChangeByte-InsertByte-CopyPart- 00:08:46.019 [2024-07-13 10:40:02.159834] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:46.019 [2024-07-13 10:40:02.159857] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:46.019 [2024-07-13 10:40:02.159874] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:46.019 #33 NEW cov: 10762 ft: 14389 corp: 5/42b lim: 40 exec/s: 33 rss: 68Mb L: 5/16 MS: 1 InsertRepeatedBytes- 00:08:46.019 [2024-07-13 10:40:02.367696] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:46.019 [2024-07-13 10:40:02.367718] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:46.019 [2024-07-13 10:40:02.367735] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:46.279 #34 NEW cov: 10762 ft: 14741 corp: 6/59b lim: 40 exec/s: 34 rss: 68Mb L: 17/17 MS: 1 InsertByte- 00:08:46.279 [2024-07-13 10:40:02.575011] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:46.279 [2024-07-13 10:40:02.575034] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:46.279 [2024-07-13 10:40:02.575057] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:46.539 #35 NEW cov: 10762 ft: 15109 corp: 7/76b lim: 40 exec/s: 35 rss: 68Mb L: 17/17 MS: 
1 InsertByte- 00:08:46.539 [2024-07-13 10:40:02.784854] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:46.539 [2024-07-13 10:40:02.784877] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:46.539 [2024-07-13 10:40:02.784895] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:46.539 #36 NEW cov: 10762 ft: 15160 corp: 8/94b lim: 40 exec/s: 36 rss: 68Mb L: 18/18 MS: 1 InsertByte- 00:08:46.796 [2024-07-13 10:40:02.992954] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:46.796 [2024-07-13 10:40:02.992977] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:46.796 [2024-07-13 10:40:02.992994] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:46.796 #37 NEW cov: 10769 ft: 15227 corp: 9/126b lim: 40 exec/s: 37 rss: 68Mb L: 32/32 MS: 1 CrossOver- 00:08:47.054 [2024-07-13 10:40:03.199734] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:47.054 [2024-07-13 10:40:03.199755] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:47.054 [2024-07-13 10:40:03.199771] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:47.054 #38 NEW cov: 10769 ft: 15576 corp: 10/137b lim: 40 exec/s: 19 rss: 68Mb L: 11/32 MS: 1 EraseBytes- 00:08:47.054 #38 DONE cov: 10769 ft: 15576 corp: 10/137b lim: 40 exec/s: 19 rss: 68Mb 00:08:47.054 Done 38 runs in 2 second(s) 00:08:47.313 10:40:03 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-1 00:08:47.313 10:40:03 -- ../common.sh@72 -- # (( i++ )) 00:08:47.313 10:40:03 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:47.313 10:40:03 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:47.313 10:40:03 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:08:47.313 10:40:03 -- vfio/run.sh@23 -- # local timen=1 00:08:47.313 10:40:03 -- vfio/run.sh@24 -- # local core=0x1 00:08:47.313 10:40:03 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:47.313 10:40:03 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:08:47.313 10:40:03 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:08:47.313 10:40:03 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:08:47.313 10:40:03 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:08:47.313 10:40:03 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:47.313 10:40:03 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:08:47.313 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:47.313 10:40:03 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:08:47.313 [2024-07-13 10:40:03.619918] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 
23.11.0 initialization... 00:08:47.313 [2024-07-13 10:40:03.620011] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2001625 ] 00:08:47.313 EAL: No free 2048 kB hugepages reported on node 1 00:08:47.313 [2024-07-13 10:40:03.692713] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:47.571 [2024-07-13 10:40:03.728434] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:47.571 [2024-07-13 10:40:03.728591] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:47.571 INFO: Running with entropic power schedule (0xFF, 100). 00:08:47.571 INFO: Seed: 1490801218 00:08:47.571 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x266dfcc, 0x26c0a63), 00:08:47.572 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x26c0a68,0x2beb3d8), 00:08:47.572 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:47.572 INFO: A corpus is not provided, starting from an empty corpus 00:08:47.572 #2 INITED exec/s: 0 rss: 61Mb 00:08:47.572 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:47.572 This may also happen if the target rejected all inputs we tried so far 00:08:47.829 [2024-07-13 10:40:03.987874] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:48.088 NEW_FUNC[1/636]: 0x49f060 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:104 00:08:48.088 NEW_FUNC[2/636]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:48.088 #6 NEW cov: 10707 ft: 10662 corp: 2/34b lim: 80 exec/s: 0 rss: 67Mb L: 33/33 MS: 4 CopyPart-ChangeBit-ShuffleBytes-InsertRepeatedBytes- 00:08:48.088 [2024-07-13 10:40:04.399007] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:48.347 #8 NEW cov: 10721 ft: 13624 corp: 3/108b lim: 80 exec/s: 0 rss: 68Mb L: 74/74 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:48.347 [2024-07-13 10:40:04.532515] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:48.347 #9 NEW cov: 10721 ft: 14958 corp: 4/139b lim: 80 exec/s: 0 rss: 69Mb L: 31/74 MS: 1 EraseBytes- 00:08:48.347 [2024-07-13 10:40:04.646568] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:48.347 #10 NEW cov: 10721 ft: 15560 corp: 5/169b lim: 80 exec/s: 0 rss: 69Mb L: 30/74 MS: 1 CrossOver- 00:08:48.605 [2024-07-13 10:40:04.760323] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:48.605 NEW_FUNC[1/1]: 0x1948490 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:48.605 #11 NEW cov: 10738 ft: 15766 corp: 6/200b lim: 80 exec/s: 0 rss: 69Mb L: 31/74 MS: 1 ChangeBit- 00:08:48.605 [2024-07-13 10:40:04.874014] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:48.605 #12 NEW cov: 10738 ft: 15897 corp: 7/231b lim: 80 exec/s: 12 rss: 69Mb L: 31/74 MS: 1 ChangeByte- 00:08:48.605 [2024-07-13 10:40:04.987843] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:48.864 #13 NEW 
cov: 10738 ft: 16052 corp: 8/293b lim: 80 exec/s: 13 rss: 69Mb L: 62/74 MS: 1 InsertRepeatedBytes- 00:08:48.864 [2024-07-13 10:40:05.101745] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:48.864 #14 NEW cov: 10738 ft: 16090 corp: 9/342b lim: 80 exec/s: 14 rss: 69Mb L: 49/74 MS: 1 CrossOver- 00:08:48.864 [2024-07-13 10:40:05.215891] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:49.124 #15 NEW cov: 10738 ft: 16170 corp: 10/373b lim: 80 exec/s: 15 rss: 70Mb L: 31/74 MS: 1 ChangeBit- 00:08:49.124 [2024-07-13 10:40:05.329531] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:49.124 #16 NEW cov: 10738 ft: 16329 corp: 11/444b lim: 80 exec/s: 16 rss: 70Mb L: 71/74 MS: 1 InsertRepeatedBytes- 00:08:49.124 [2024-07-13 10:40:05.443902] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:49.383 #17 NEW cov: 10738 ft: 16375 corp: 12/461b lim: 80 exec/s: 17 rss: 70Mb L: 17/74 MS: 1 CrossOver- 00:08:49.383 [2024-07-13 10:40:05.558125] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:49.383 #18 NEW cov: 10738 ft: 16614 corp: 13/525b lim: 80 exec/s: 18 rss: 70Mb L: 64/74 MS: 1 CopyPart- 00:08:49.383 [2024-07-13 10:40:05.682063] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:49.383 #19 NEW cov: 10745 ft: 16721 corp: 14/599b lim: 80 exec/s: 19 rss: 70Mb L: 74/74 MS: 1 ChangeByte- 00:08:49.642 [2024-07-13 10:40:05.795902] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:49.642 #20 NEW cov: 10745 ft: 16936 corp: 15/670b lim: 80 exec/s: 20 rss: 70Mb L: 71/74 MS: 1 CMP- DE: "(\000\000\000\000\000\000\000"- 00:08:49.642 [2024-07-13 10:40:05.909723] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:49.642 #21 NEW cov: 10745 ft: 16955 corp: 16/701b lim: 80 exec/s: 10 rss: 70Mb L: 31/74 MS: 1 ChangeBinInt- 00:08:49.642 #21 DONE cov: 10745 ft: 16955 corp: 16/701b lim: 80 exec/s: 10 rss: 70Mb 00:08:49.642 ###### Recommended dictionary. ###### 00:08:49.642 "(\000\000\000\000\000\000\000" # Uses: 0 00:08:49.642 ###### End of recommended dictionary. 
###### 00:08:49.642 Done 21 runs in 2 second(s) 00:08:49.901 10:40:06 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-2 00:08:49.901 10:40:06 -- ../common.sh@72 -- # (( i++ )) 00:08:49.901 10:40:06 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:49.901 10:40:06 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:49.901 10:40:06 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:08:49.901 10:40:06 -- vfio/run.sh@23 -- # local timen=1 00:08:49.901 10:40:06 -- vfio/run.sh@24 -- # local core=0x1 00:08:49.901 10:40:06 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:49.901 10:40:06 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:08:49.901 10:40:06 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:08:49.901 10:40:06 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:08:49.901 10:40:06 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:08:49.901 10:40:06 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:49.901 10:40:06 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:08:49.901 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:49.901 10:40:06 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:08:49.901 [2024-07-13 10:40:06.272233] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:08:49.901 [2024-07-13 10:40:06.272301] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2002163 ] 00:08:50.160 EAL: No free 2048 kB hugepages reported on node 1 00:08:50.160 [2024-07-13 10:40:06.342683] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:50.160 [2024-07-13 10:40:06.378339] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:50.160 [2024-07-13 10:40:06.378493] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:50.419 INFO: Running with entropic power schedule (0xFF, 100). 00:08:50.419 INFO: Seed: 4140760134 00:08:50.419 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x266dfcc, 0x26c0a63), 00:08:50.419 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x26c0a68,0x2beb3d8), 00:08:50.419 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:50.419 INFO: A corpus is not provided, starting from an empty corpus 00:08:50.419 #2 INITED exec/s: 0 rss: 61Mb 00:08:50.419 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:50.419 This may also happen if the target rejected all inputs we tried so far 00:08:50.678 NEW_FUNC[1/632]: 0x49f740 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:125 00:08:50.678 NEW_FUNC[2/632]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:50.678 #27 NEW cov: 10700 ft: 10427 corp: 2/109b lim: 320 exec/s: 0 rss: 66Mb L: 108/108 MS: 5 InsertRepeatedBytes-CrossOver-CrossOver-ChangeByte-InsertRepeatedBytes- 00:08:50.938 #45 NEW cov: 10717 ft: 13201 corp: 3/197b lim: 320 exec/s: 0 rss: 68Mb L: 88/108 MS: 3 InsertByte-EraseBytes-InsertRepeatedBytes- 00:08:51.198 NEW_FUNC[1/1]: 0x1948490 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:51.198 #46 NEW cov: 10734 ft: 14138 corp: 4/305b lim: 320 exec/s: 0 rss: 69Mb L: 108/108 MS: 1 ShuffleBytes- 00:08:51.456 #47 NEW cov: 10734 ft: 15144 corp: 5/413b lim: 320 exec/s: 47 rss: 69Mb L: 108/108 MS: 1 ChangeBinInt- 00:08:51.715 #48 NEW cov: 10734 ft: 15693 corp: 6/528b lim: 320 exec/s: 48 rss: 69Mb L: 115/115 MS: 1 CopyPart- 00:08:51.715 #49 NEW cov: 10734 ft: 15902 corp: 7/743b lim: 320 exec/s: 49 rss: 69Mb L: 215/215 MS: 1 CopyPart- 00:08:51.975 #52 NEW cov: 10734 ft: 16140 corp: 8/829b lim: 320 exec/s: 52 rss: 69Mb L: 86/215 MS: 3 InsertByte-CrossOver-CrossOver- 00:08:52.233 #53 NEW cov: 10741 ft: 16375 corp: 9/1044b lim: 320 exec/s: 53 rss: 69Mb L: 215/215 MS: 1 ChangeByte- 00:08:52.493 #54 NEW cov: 10741 ft: 16415 corp: 10/1153b lim: 320 exec/s: 27 rss: 69Mb L: 109/215 MS: 1 InsertByte- 00:08:52.493 #54 DONE cov: 10741 ft: 16415 corp: 10/1153b lim: 320 exec/s: 27 rss: 69Mb 00:08:52.493 Done 54 runs in 2 second(s) 00:08:52.752 10:40:08 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-3 00:08:52.752 10:40:08 -- ../common.sh@72 -- # (( i++ )) 00:08:52.752 10:40:08 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:52.752 10:40:08 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:52.752 10:40:08 -- vfio/run.sh@22 -- # local fuzzer_type=4 00:08:52.752 10:40:08 -- vfio/run.sh@23 -- # local timen=1 00:08:52.752 10:40:08 -- vfio/run.sh@24 -- # local core=0x1 00:08:52.752 10:40:08 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:52.752 10:40:08 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:08:52.752 10:40:08 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:08:52.752 10:40:08 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:08:52.752 10:40:08 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:08:52.752 10:40:08 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:52.752 10:40:08 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:08:52.752 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:52.752 10:40:08 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:08:52.752 [2024-07-13 10:40:09.021811] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:08:52.752 [2024-07-13 10:40:09.021883] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2002634 ] 00:08:52.752 EAL: No free 2048 kB hugepages reported on node 1 00:08:52.752 [2024-07-13 10:40:09.094609] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:52.752 [2024-07-13 10:40:09.131321] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:52.752 [2024-07-13 10:40:09.131482] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:53.013 INFO: Running with entropic power schedule (0xFF, 100). 00:08:53.013 INFO: Seed: 2603852341 00:08:53.013 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x266dfcc, 0x26c0a63), 00:08:53.013 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x26c0a68,0x2beb3d8), 00:08:53.013 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:53.013 INFO: A corpus is not provided, starting from an empty corpus 00:08:53.013 #2 INITED exec/s: 0 rss: 61Mb 00:08:53.013 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:53.013 This may also happen if the target rejected all inputs we tried so far 00:08:53.295 [2024-07-13 10:40:09.408493] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=323 offset=0 prot=0x3: Invalid argument 00:08:53.295 [2024-07-13 10:40:09.408530] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:53.295 [2024-07-13 10:40:09.408541] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:53.295 [2024-07-13 10:40:09.408563] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:53.295 [2024-07-13 10:40:09.409486] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:53.295 [2024-07-13 10:40:09.409498] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:53.295 [2024-07-13 10:40:09.409514] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:53.560 NEW_FUNC[1/637]: 0x49ffc0 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:145 00:08:53.560 NEW_FUNC[2/637]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:53.560 #5 NEW cov: 10728 ft: 10688 corp: 2/37b lim: 320 exec/s: 0 rss: 67Mb L: 36/36 MS: 3 ChangeBit-ChangeBit-InsertRepeatedBytes- 00:08:53.560 [2024-07-13 10:40:09.907384] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:53.560 [2024-07-13 10:40:09.907417] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 
flags=0x3: Invalid argument 00:08:53.560 [2024-07-13 10:40:09.907428] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:53.560 [2024-07-13 10:40:09.907467] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:53.561 [2024-07-13 10:40:09.908378] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:53.561 [2024-07-13 10:40:09.908396] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:53.561 [2024-07-13 10:40:09.908412] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:53.823 #6 NEW cov: 10742 ft: 13250 corp: 3/73b lim: 320 exec/s: 0 rss: 68Mb L: 36/36 MS: 1 ShuffleBytes- 00:08:53.823 [2024-07-13 10:40:10.113819] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0x800 prot=0x3: Invalid argument 00:08:53.823 [2024-07-13 10:40:10.113847] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0x800 flags=0x3: Invalid argument 00:08:53.823 [2024-07-13 10:40:10.113869] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:53.823 [2024-07-13 10:40:10.113902] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:53.823 [2024-07-13 10:40:10.114816] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:53.823 [2024-07-13 10:40:10.114834] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:53.823 [2024-07-13 10:40:10.114851] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:54.081 NEW_FUNC[1/1]: 0x1948490 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:54.081 #7 NEW cov: 10759 ft: 14608 corp: 4/109b lim: 320 exec/s: 0 rss: 69Mb L: 36/36 MS: 1 ChangeBit- 00:08:54.081 [2024-07-13 10:40:10.322579] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:54.081 [2024-07-13 10:40:10.322604] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:54.081 [2024-07-13 10:40:10.322615] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:54.081 [2024-07-13 10:40:10.322647] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:54.081 [2024-07-13 10:40:10.323573] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:08:54.081 [2024-07-13 10:40:10.323592] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:54.081 [2024-07-13 10:40:10.323609] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:54.081 #8 NEW cov: 10759 ft: 15063 corp: 5/145b lim: 320 exec/s: 8 rss: 69Mb L: 36/36 MS: 1 ChangeBinInt- 00:08:54.340 [2024-07-13 10:40:10.529341] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [0x10000000, 0x10000000) fd=325 offset=0 
prot=0x3: Invalid argument 00:08:54.340 [2024-07-13 10:40:10.529365] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0x10000000, 0x10000000) offset=0 flags=0x3: Invalid argument 00:08:54.340 [2024-07-13 10:40:10.529376] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:54.340 [2024-07-13 10:40:10.529393] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:54.340 [2024-07-13 10:40:10.530369] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0x10000000, 0x10000000) flags=0: No such file or directory 00:08:54.340 [2024-07-13 10:40:10.530388] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:54.340 [2024-07-13 10:40:10.530404] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:54.340 #9 NEW cov: 10759 ft: 15322 corp: 6/181b lim: 320 exec/s: 9 rss: 69Mb L: 36/36 MS: 1 ChangeBit- 00:08:54.599 [2024-07-13 10:40:10.736904] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: DMA region size 15770157678700714714 > max 8796093022208 00:08:54.599 [2024-07-13 10:40:10.736929] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0xda00000010000000, 0xb4dadadaeadadada) offset=0xdadadadadadadada flags=0x3: No space left on device 00:08:54.599 [2024-07-13 10:40:10.736940] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: No space left on device 00:08:54.599 [2024-07-13 10:40:10.736956] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:54.599 [2024-07-13 10:40:10.737918] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0xda00000010000000, 0xb4dadadaeadadada) flags=0: No such file or directory 00:08:54.599 [2024-07-13 10:40:10.737936] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:54.599 [2024-07-13 10:40:10.737953] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:54.599 #10 NEW cov: 10759 ft: 15369 corp: 7/269b lim: 320 exec/s: 10 rss: 69Mb L: 88/88 MS: 1 InsertRepeatedBytes- 00:08:54.599 [2024-07-13 10:40:10.942425] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [0x2400, 0x2400) fd=325 offset=0 prot=0x3: Invalid argument 00:08:54.599 [2024-07-13 10:40:10.942454] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0x2400, 0x2400) offset=0 flags=0x3: Invalid argument 00:08:54.599 [2024-07-13 10:40:10.942464] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:54.599 [2024-07-13 10:40:10.942497] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:54.599 [2024-07-13 10:40:10.943438] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0x2400, 0x2400) flags=0: No such file or directory 00:08:54.599 [2024-07-13 10:40:10.943462] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:54.599 [2024-07-13 10:40:10.943494] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:54.859 #11 NEW cov: 10759 ft: 15659 corp: 8/306b lim: 320 exec/s: 11 rss: 69Mb L: 37/88 MS: 1 InsertByte- 00:08:54.859 
[2024-07-13 10:40:11.146040] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: DMA region size 15770157678700714714 > max 8796093022208 00:08:54.859 [2024-07-13 10:40:11.146072] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0xda00000010000000, 0xb4dadadaeadadada) offset=0xdadadadadadadada flags=0x3: No space left on device 00:08:54.859 [2024-07-13 10:40:11.146086] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: No space left on device 00:08:54.859 [2024-07-13 10:40:11.146103] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:54.859 [2024-07-13 10:40:11.147066] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0xda00000010000000, 0xb4dadadaeadadada) flags=0: No such file or directory 00:08:54.859 [2024-07-13 10:40:11.147085] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:54.859 [2024-07-13 10:40:11.147102] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:55.119 #12 NEW cov: 10766 ft: 15698 corp: 9/395b lim: 320 exec/s: 12 rss: 69Mb L: 89/89 MS: 1 InsertByte- 00:08:55.119 [2024-07-13 10:40:11.351268] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: DMA region size 15770157678700714714 > max 8796093022208 00:08:55.119 [2024-07-13 10:40:11.351291] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0xdadadadadadada00, 0xb5b5b5b5b5b5b4da) offset=0xdadadadadadadada flags=0x3: No space left on device 00:08:55.119 [2024-07-13 10:40:11.351303] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: No space left on device 00:08:55.119 [2024-07-13 10:40:11.351336] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:55.119 [2024-07-13 10:40:11.352281] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0xdadadadadadada00, 0xb5b5b5b5b5b5b4da) flags=0: No such file or directory 00:08:55.119 [2024-07-13 10:40:11.352300] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:08:55.119 [2024-07-13 10:40:11.352316] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:08:55.119 #13 NEW cov: 10766 ft: 15715 corp: 10/478b lim: 320 exec/s: 6 rss: 69Mb L: 83/89 MS: 1 EraseBytes- 00:08:55.119 #13 DONE cov: 10766 ft: 15715 corp: 10/478b lim: 320 exec/s: 6 rss: 69Mb 00:08:55.119 Done 13 runs in 2 second(s) 00:08:55.378 10:40:11 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-4 00:08:55.378 10:40:11 -- ../common.sh@72 -- # (( i++ )) 00:08:55.378 10:40:11 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:55.378 10:40:11 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:55.378 10:40:11 -- vfio/run.sh@22 -- # local fuzzer_type=5 00:08:55.378 10:40:11 -- vfio/run.sh@23 -- # local timen=1 00:08:55.378 10:40:11 -- vfio/run.sh@24 -- # local core=0x1 00:08:55.378 10:40:11 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:55.378 10:40:11 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:08:55.378 10:40:11 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:08:55.378 10:40:11 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:08:55.378 10:40:11 -- vfio/run.sh@29 -- # local 
vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:08:55.378 10:40:11 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:55.378 10:40:11 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:08:55.378 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:55.378 10:40:11 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:08:55.637 [2024-07-13 10:40:11.769993] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:08:55.637 [2024-07-13 10:40:11.770087] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2003020 ] 00:08:55.637 EAL: No free 2048 kB hugepages reported on node 1 00:08:55.637 [2024-07-13 10:40:11.843268] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:55.637 [2024-07-13 10:40:11.879452] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:55.637 [2024-07-13 10:40:11.879604] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:55.896 INFO: Running with entropic power schedule (0xFF, 100). 00:08:55.896 INFO: Seed: 1056844084 00:08:55.896 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x266dfcc, 0x26c0a63), 00:08:55.896 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x26c0a68,0x2beb3d8), 00:08:55.896 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:55.896 INFO: A corpus is not provided, starting from an empty corpus 00:08:55.896 #2 INITED exec/s: 0 rss: 61Mb 00:08:55.896 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:55.896 This may also happen if the target rejected all inputs we tried so far 00:08:55.896 [2024-07-13 10:40:12.172480] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:55.896 [2024-07-13 10:40:12.172527] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:56.414 NEW_FUNC[1/638]: 0x4a09c0 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:172 00:08:56.414 NEW_FUNC[2/638]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:56.414 #15 NEW cov: 10730 ft: 10381 corp: 2/47b lim: 120 exec/s: 0 rss: 66Mb L: 46/46 MS: 3 ChangeByte-CopyPart-InsertRepeatedBytes- 00:08:56.414 [2024-07-13 10:40:12.645649] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:56.414 [2024-07-13 10:40:12.645693] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:56.414 #16 NEW cov: 10747 ft: 13193 corp: 3/95b lim: 120 exec/s: 0 rss: 67Mb L: 48/48 MS: 1 CMP- DE: "\020\000"- 00:08:56.672 [2024-07-13 10:40:12.844853] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:56.672 [2024-07-13 10:40:12.844884] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:56.672 NEW_FUNC[1/1]: 0x1948490 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:56.672 #19 NEW cov: 10764 ft: 13820 corp: 4/169b lim: 120 exec/s: 0 rss: 68Mb L: 74/74 MS: 3 CopyPart-ChangeBit-InsertRepeatedBytes- 00:08:56.672 [2024-07-13 10:40:13.051875] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:56.672 [2024-07-13 10:40:13.051904] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:56.931 #20 NEW cov: 10764 ft: 14402 corp: 5/272b lim: 120 exec/s: 20 rss: 68Mb L: 103/103 MS: 1 InsertRepeatedBytes- 00:08:56.931 [2024-07-13 10:40:13.249433] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:56.931 [2024-07-13 10:40:13.249470] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:57.189 #21 NEW cov: 10764 ft: 14566 corp: 6/320b lim: 120 exec/s: 21 rss: 68Mb L: 48/103 MS: 1 ChangeByte- 00:08:57.189 [2024-07-13 10:40:13.447529] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:57.189 [2024-07-13 10:40:13.447559] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:57.189 #22 NEW cov: 10764 ft: 15106 corp: 7/396b lim: 120 exec/s: 22 rss: 68Mb L: 76/103 MS: 1 PersAutoDict- DE: "\020\000"- 00:08:57.448 [2024-07-13 10:40:13.644509] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:57.448 [2024-07-13 10:40:13.644540] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:57.448 #23 NEW cov: 10764 ft: 15613 corp: 8/514b lim: 120 exec/s: 23 rss: 68Mb L: 118/118 MS: 1 InsertRepeatedBytes- 00:08:57.707 [2024-07-13 10:40:13.843218] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:57.707 [2024-07-13 10:40:13.843253] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:57.707 #24 NEW cov: 10771 ft: 15924 corp: 9/617b lim: 120 exec/s: 24 rss: 
68Mb L: 103/118 MS: 1 ShuffleBytes- 00:08:57.707 [2024-07-13 10:40:14.038260] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:57.707 [2024-07-13 10:40:14.038290] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:57.965 #25 NEW cov: 10771 ft: 15981 corp: 10/720b lim: 120 exec/s: 12 rss: 68Mb L: 103/118 MS: 1 CopyPart- 00:08:57.965 #25 DONE cov: 10771 ft: 15981 corp: 10/720b lim: 120 exec/s: 12 rss: 68Mb 00:08:57.965 ###### Recommended dictionary. ###### 00:08:57.965 "\020\000" # Uses: 1 00:08:57.965 ###### End of recommended dictionary. ###### 00:08:57.965 Done 25 runs in 2 second(s) 00:08:58.224 10:40:14 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-5 00:08:58.224 10:40:14 -- ../common.sh@72 -- # (( i++ )) 00:08:58.224 10:40:14 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:58.224 10:40:14 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:58.224 10:40:14 -- vfio/run.sh@22 -- # local fuzzer_type=6 00:08:58.224 10:40:14 -- vfio/run.sh@23 -- # local timen=1 00:08:58.224 10:40:14 -- vfio/run.sh@24 -- # local core=0x1 00:08:58.224 10:40:14 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:58.224 10:40:14 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:08:58.224 10:40:14 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:08:58.224 10:40:14 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:08:58.224 10:40:14 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:08:58.224 10:40:14 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:58.224 10:40:14 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:08:58.224 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:58.224 10:40:14 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:08:58.224 [2024-07-13 10:40:14.441531] Starting SPDK v24.01.1-pre git sha1 4b94202c6 / DPDK 23.11.0 initialization... 00:08:58.224 [2024-07-13 10:40:14.441600] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2003549 ] 00:08:58.224 EAL: No free 2048 kB hugepages reported on node 1 00:08:58.224 [2024-07-13 10:40:14.512163] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:58.224 [2024-07-13 10:40:14.547904] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:58.224 [2024-07-13 10:40:14.548056] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:58.482 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:58.482 INFO: Seed: 3719830759 00:08:58.482 INFO: Loaded 1 modules (338583 inline 8-bit counters): 338583 [0x266dfcc, 0x26c0a63), 00:08:58.482 INFO: Loaded 1 PC tables (338583 PCs): 338583 [0x26c0a68,0x2beb3d8), 00:08:58.482 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:58.482 INFO: A corpus is not provided, starting from an empty corpus 00:08:58.482 #2 INITED exec/s: 0 rss: 60Mb 00:08:58.482 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:58.482 This may also happen if the target rejected all inputs we tried so far 00:08:58.482 [2024-07-13 10:40:14.836472] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:58.482 [2024-07-13 10:40:14.836523] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:58.999 NEW_FUNC[1/638]: 0x4a16b0 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:08:58.999 NEW_FUNC[2/638]: 0x4a3c80 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:58.999 #5 NEW cov: 10721 ft: 10582 corp: 2/59b lim: 90 exec/s: 0 rss: 66Mb L: 58/58 MS: 3 ShuffleBytes-CrossOver-InsertRepeatedBytes- 00:08:58.999 [2024-07-13 10:40:15.337593] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:58.999 [2024-07-13 10:40:15.337635] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:59.258 #8 NEW cov: 10735 ft: 12691 corp: 3/93b lim: 90 exec/s: 0 rss: 68Mb L: 34/58 MS: 3 CMP-InsertByte-CrossOver- DE: "\001\000\000\020"- 00:08:59.258 [2024-07-13 10:40:15.546304] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:59.258 [2024-07-13 10:40:15.546333] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:59.517 NEW_FUNC[1/1]: 0x1948490 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:59.517 #9 NEW cov: 10752 ft: 14606 corp: 4/128b lim: 90 exec/s: 0 rss: 69Mb L: 35/58 MS: 1 EraseBytes- 00:08:59.517 [2024-07-13 10:40:15.755703] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:59.517 [2024-07-13 10:40:15.755733] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:59.517 #10 NEW cov: 10752 ft: 14794 corp: 5/167b lim: 90 exec/s: 10 rss: 69Mb L: 39/58 MS: 1 CMP- DE: "\015\000\000\000"- 00:08:59.776 [2024-07-13 10:40:15.964891] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:59.776 [2024-07-13 10:40:15.964922] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:59.776 #11 NEW cov: 10752 ft: 15335 corp: 6/219b lim: 90 exec/s: 11 rss: 70Mb L: 52/58 MS: 1 InsertRepeatedBytes- 00:09:00.035 [2024-07-13 10:40:16.171702] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:00.035 [2024-07-13 10:40:16.171732] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:00.035 #12 NEW cov: 10752 ft: 15425 corp: 7/258b lim: 90 exec/s: 12 rss: 70Mb L: 39/58 MS: 1 ChangeBit- 00:09:00.035 [2024-07-13 10:40:16.378116] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:00.035 [2024-07-13 
10:40:16.378147] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:00.294 #13 NEW cov: 10752 ft: 15470 corp: 8/292b lim: 90 exec/s: 13 rss: 70Mb L: 34/58 MS: 1 CMP- DE: "\000\000\000\000\005}\261\020"- 00:09:00.294 [2024-07-13 10:40:16.586162] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:00.294 [2024-07-13 10:40:16.586193] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:00.554 #14 NEW cov: 10759 ft: 15700 corp: 9/331b lim: 90 exec/s: 14 rss: 70Mb L: 39/58 MS: 1 PersAutoDict- DE: "\000\000\000\000\005}\261\020"- 00:09:00.554 [2024-07-13 10:40:16.792547] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:00.554 [2024-07-13 10:40:16.792577] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:00.554 #15 NEW cov: 10759 ft: 16141 corp: 10/399b lim: 90 exec/s: 7 rss: 70Mb L: 68/68 MS: 1 CopyPart- 00:09:00.554 #15 DONE cov: 10759 ft: 16141 corp: 10/399b lim: 90 exec/s: 7 rss: 70Mb 00:09:00.554 ###### Recommended dictionary. ###### 00:09:00.554 "\001\000\000\020" # Uses: 0 00:09:00.554 "\015\000\000\000" # Uses: 0 00:09:00.554 "\000\000\000\000\005}\261\020" # Uses: 1 00:09:00.554 ###### End of recommended dictionary. ###### 00:09:00.554 Done 15 runs in 2 second(s) 00:09:00.813 10:40:17 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-6 00:09:00.813 10:40:17 -- ../common.sh@72 -- # (( i++ )) 00:09:00.813 10:40:17 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:00.813 10:40:17 -- vfio/run.sh@75 -- # trap - SIGINT SIGTERM EXIT 00:09:00.813 00:09:00.813 real 0m19.273s 00:09:00.813 user 0m27.430s 00:09:00.813 sys 0m1.755s 00:09:00.813 10:40:17 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:00.813 10:40:17 -- common/autotest_common.sh@10 -- # set +x 00:09:00.813 ************************************ 00:09:00.813 END TEST vfio_fuzz 00:09:00.813 ************************************ 00:09:01.072 10:40:17 -- fuzz/llvm.sh@67 -- # [[ 1 -eq 0 ]] 00:09:01.072 00:09:01.072 real 1m21.404s 00:09:01.072 user 2m6.258s 00:09:01.072 sys 0m8.491s 00:09:01.072 10:40:17 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:01.072 10:40:17 -- common/autotest_common.sh@10 -- # set +x 00:09:01.072 ************************************ 00:09:01.072 END TEST llvm_fuzz 00:09:01.072 ************************************ 00:09:01.072 10:40:17 -- spdk/autotest.sh@378 -- # [[ 0 -eq 1 ]] 00:09:01.072 10:40:17 -- spdk/autotest.sh@383 -- # trap - SIGINT SIGTERM EXIT 00:09:01.072 10:40:17 -- spdk/autotest.sh@385 -- # timing_enter post_cleanup 00:09:01.072 10:40:17 -- common/autotest_common.sh@712 -- # xtrace_disable 00:09:01.072 10:40:17 -- common/autotest_common.sh@10 -- # set +x 00:09:01.072 10:40:17 -- spdk/autotest.sh@386 -- # autotest_cleanup 00:09:01.072 10:40:17 -- common/autotest_common.sh@1371 -- # local autotest_es=0 00:09:01.072 10:40:17 -- common/autotest_common.sh@1372 -- # xtrace_disable 00:09:01.072 10:40:17 -- common/autotest_common.sh@10 -- # set +x 00:09:07.655 INFO: APP EXITING 00:09:07.655 INFO: killing all VMs 00:09:07.655 INFO: killing vhost app 00:09:07.655 INFO: EXIT DONE 00:09:09.562 Waiting for block devices as requested 00:09:09.562 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:09.822 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:09.822 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:09.822 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:10.081 0000:00:04.3 (8086 2021): vfio-pci -> 
ioatdma 00:09:10.081 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:10.081 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:10.081 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:10.341 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:10.341 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:10.341 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:10.600 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:10.600 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:10.600 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:10.859 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:10.859 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:10.859 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:09:15.055 Cleaning 00:09:15.055 Removing: /dev/shm/spdk_tgt_trace.pid1967059 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1964596 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1965857 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1967059 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1967716 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1967991 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1968311 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1968639 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1968907 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1969175 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1969459 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1969766 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1970441 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1973563 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1973866 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1974246 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1974443 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1975005 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1975027 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1975594 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1975785 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1976156 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1976180 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1976468 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1976587 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1977113 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1977395 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1977552 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1977752 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1978056 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1978078 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1978195 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1978408 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1978691 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1978957 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1979244 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1979493 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1979687 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1979848 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1980115 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1980384 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1980666 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1980940 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1981188 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1981326 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1981531 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1981800 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1982083 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1982349 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1982638 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1982828 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1983010 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1983208 
00:09:15.055 Removing: /var/run/dpdk/spdk_pid1983498 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1983766 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1984048 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1984322 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1984510 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1984649 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1984912 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1985185 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1985486 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1985752 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1986035 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1986183 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1986371 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1986624 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1986914 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1987182 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1987463 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1987715 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1987905 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1988082 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1988270 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1988888 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1989298 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1989786 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1990382 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1991106 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1991646 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1992092 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1992476 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1993019 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1993370 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1993852 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1994384 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1994687 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1995223 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1995622 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1996056 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1996593 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1996906 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1997426 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1997902 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1998261 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1998798 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1999196 00:09:15.055 Removing: /var/run/dpdk/spdk_pid1999629 00:09:15.055 Removing: /var/run/dpdk/spdk_pid2000173 00:09:15.055 Removing: /var/run/dpdk/spdk_pid2000682 00:09:15.055 Removing: /var/run/dpdk/spdk_pid2001076 00:09:15.055 Removing: /var/run/dpdk/spdk_pid2001625 00:09:15.056 Removing: /var/run/dpdk/spdk_pid2002163 00:09:15.056 Removing: /var/run/dpdk/spdk_pid2002634 00:09:15.056 Removing: /var/run/dpdk/spdk_pid2003020 00:09:15.056 Removing: /var/run/dpdk/spdk_pid2003549 00:09:15.056 Clean 00:09:15.056 killing process with pid 1919619 00:09:18.344 killing process with pid 1919616 00:09:18.344 killing process with pid 1919618 00:09:18.344 killing process with pid 1919617 00:09:18.344 10:40:34 -- common/autotest_common.sh@1436 -- # return 0 00:09:18.344 10:40:34 -- spdk/autotest.sh@387 -- # timing_exit post_cleanup 00:09:18.344 10:40:34 -- common/autotest_common.sh@718 -- # xtrace_disable 00:09:18.344 10:40:34 -- common/autotest_common.sh@10 -- # set +x 00:09:18.344 10:40:34 -- spdk/autotest.sh@389 -- # timing_exit autotest 00:09:18.344 10:40:34 -- common/autotest_common.sh@718 -- # xtrace_disable 00:09:18.344 10:40:34 -- common/autotest_common.sh@10 -- # set +x 00:09:18.344 10:40:34 -- 
00:09:18.344 10:40:34 -- spdk/autotest.sh@390 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:09:18.344 10:40:34 -- spdk/autotest.sh@392 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]]
00:09:18.344 10:40:34 -- spdk/autotest.sh@392 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log
00:09:18.344 10:40:34 -- spdk/autotest.sh@394 -- # hash lcov
00:09:18.344 10:40:34 -- spdk/autotest.sh@394 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]]
00:09:18.344 10:40:34 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:09:18.344 10:40:34 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]]
00:09:18.344 10:40:34 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:09:18.344 10:40:34 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:09:18.344 10:40:34 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:18.344 10:40:34 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:18.344 10:40:34 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:18.344 10:40:34 -- paths/export.sh@5 -- $ export PATH
00:09:18.344 10:40:34 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:18.344 10:40:34 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
00:09:18.344 10:40:34 -- common/autobuild_common.sh@435 -- $ date +%s
00:09:18.344 10:40:34 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1720860034.XXXXXX
00:09:18.344 10:40:34 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1720860034.VSc8Bt
00:09:18.344 10:40:34 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]]
00:09:18.344 10:40:34 -- common/autobuild_common.sh@441 -- $ '[' -n v23.11 ']'
00:09:18.344 10:40:34 -- common/autobuild_common.sh@442 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:09:18.603 10:40:34 -- common/autobuild_common.sh@442 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk'
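The autobuild trace above creates a throwaway workspace named from the epoch time (mktemp -dt spdk_1720860034.XXXXXX yields /tmp/spdk_1720860034.VSc8Bt). The same pattern in isolation; the trap is not in the trace above and is added here only for illustration:

    ts=$(date +%s)                                       # e.g. 1720860034 in the log
    SPDK_WORKSPACE=$(mktemp -d -t "spdk_${ts}.XXXXXX")   # unique scratch dir under $TMPDIR
    trap 'rm -rf "$SPDK_WORKSPACE"' EXIT                 # clean up the scratch dir on exit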
00:09:18.603 10:40:34 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
00:09:18.603 10:40:34 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:09:18.603 10:40:34 -- common/autobuild_common.sh@451 -- $ get_config_params
00:09:18.603 10:40:34 -- common/autotest_common.sh@387 -- $ xtrace_disable
00:09:18.603 10:40:34 -- common/autotest_common.sh@10 -- $ set +x
00:09:18.603 10:40:34 -- common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user'
00:09:18.603 10:40:34 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j112
00:09:18.603 10:40:34 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:18.603 10:40:34 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]]
00:09:18.603 10:40:34 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]]
00:09:18.603 10:40:34 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]]
00:09:18.603 10:40:34 -- spdk/autopackage.sh@19 -- $ timing_finish
00:09:18.603 10:40:34 -- common/autotest_common.sh@724 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:09:18.603 10:40:34 -- common/autotest_common.sh@725 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']'
00:09:18.603 10:40:34 -- common/autotest_common.sh@727 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:09:18.603 10:40:34 -- spdk/autopackage.sh@20 -- $ exit 0
00:09:18.603 + [[ -n 1863883 ]]
00:09:18.603 + sudo kill 1863883
00:09:18.614 [Pipeline] }
00:09:18.635 [Pipeline] // stage
00:09:18.639 [Pipeline] }
00:09:18.659 [Pipeline] // timeout
00:09:18.664 [Pipeline] }
00:09:18.683 [Pipeline] // catchError
00:09:18.687 [Pipeline] }
00:09:18.707 [Pipeline] // wrap
00:09:18.712 [Pipeline] }
00:09:18.729 [Pipeline] // catchError
00:09:18.738 [Pipeline] stage
00:09:18.741 [Pipeline] { (Epilogue)
00:09:18.759 [Pipeline] catchError
00:09:18.761 [Pipeline] {
00:09:18.775 [Pipeline] echo
00:09:18.777 Cleanup processes
00:09:18.782 [Pipeline] sh
00:09:19.065 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:19.065 2012344 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:19.078 [Pipeline] sh
00:09:19.357 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:19.357 ++ grep -v 'sudo pgrep'
00:09:19.357 ++ awk '{print $1}'
00:09:19.357 + sudo kill -9
00:09:19.357 + true
00:09:19.369 [Pipeline] sh
00:09:19.651 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:09:19.651 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,721 MiB
00:09:19.651 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,721 MiB
00:09:20.599 [Pipeline] sh
00:09:20.916 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:09:20.916 Artifacts sizes are good
00:09:20.939 [Pipeline] archiveArtifacts
00:09:20.946 Archiving artifacts
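The two xz notices during artifact compression above are its memory limiter at work: asked for one thread per CPU (112 on this node) under a ~14.4 GiB cap, xz drops to 89 threads rather than exceed the limit. A sketch of the invocation shape that produces this behavior; the file name is illustrative and the actual limit is whatever compress_artifacts.sh passes:

    xz -9 -T"$(nproc)" --memlimit-compress=14721MiB artifact.tar   # thread count auto-reduced to fit the cap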
00:09:21.000 [Pipeline] sh
00:09:21.282 + sudo chown -R sys_sgci /var/jenkins/workspace/short-fuzz-phy-autotest
00:09:21.297 [Pipeline] cleanWs
00:09:21.307 [WS-CLEANUP] Deleting project workspace...
00:09:21.307 [WS-CLEANUP] Deferred wipeout is used...
00:09:21.313 [WS-CLEANUP] done
00:09:21.315 [Pipeline] }
00:09:21.337 [Pipeline] // catchError
00:09:21.350 [Pipeline] sh
00:09:21.629 + logger -p user.info -t JENKINS-CI
00:09:21.640 [Pipeline] }
00:09:21.660 [Pipeline] // stage
00:09:21.666 [Pipeline] }
00:09:21.686 [Pipeline] // node
00:09:21.694 [Pipeline] End of Pipeline
00:09:21.844 Finished: SUCCESS